
Saturday, January 10, 2015

Needing an industrial revolution in healthcare

I recently finished a Teaching Company course on the Industrial Revolution, taught by Dr. Patrick Allitt. In it he traces the industrialization of manufacturing over the course of almost 500 years. It is a remarkable story of human ingenuity, technological progress, and extraordinary impact on the human condition. He describes a world, not so long ago, in which human existence was precarious because the tools and systems in place to meet human needs were rudimentary. The remarkable development of industrial technologies, in conjunction with effective approaches to motivating and coordinating human activity, completely transformed human existence wherever these tools were deployed.

The progress did not take place in a straight line. There were failures, and the achievements were the product of much tinkering and some planning. The exact outcomes were not predictable, but in retrospect the themes were recurrent. At each step of the way, huge breakthroughs came when tools were developed that freed humans from manual tasks and automated activities. These breakthroughs were almost always very disruptive to particular industries and populations, but the net effects for the broader population were hugely positive: people got more to eat at lower prices, along with better living and working conditions.

Another aspect of this evolution was the industrial revolution's ability to deliver not only more, but more at a lower price and higher quality. Nowhere is this more evident than in the 20th century, which saw an explosion of increasingly sophisticated consumer goods: after a product's introduction, prices would tumble while functionality and quality increased. Computers, home appliances, and automobiles all followed this pattern. What drove this remarkable expansion of high-quality plenty was a combination of science, breakthroughs in human organization, and information systems that could track key elements of cost and quality.

Dr. Allitt traces all of these elements back to their origins, which he locates in the building and operation of large ships in England. What does this have to do with medicine and biomedical research? Simply that health care delivery, and the research that supports it, can benefit from the same tools that made the industrial revolution possible. The holy grail for virtually any human activity is to get more improvement in human lives out of a given set of inputs. It has been well documented that, unlike virtually all other commercial activities in the US since WWII, health care has shown essentially no productivity gains. As noted by Kocher et al. in the NEJM (N Engl J Med 2011;365:1370-1372, October 13, 2011; DOI: 10.1056/NEJMp1109649):

Of the $2.6 trillion spent in 2010 on health care in the United States, 56% consisted of wages for health care workers. Labor is by far the largest category of expense: health care, as it is designed and delivered today, is very labor-intensive. The 16.4 million U.S. health care employees represented 11.8% of the total employed labor force in 2010. Yet unlike virtually all other sectors of the U.S. economy, health care has experienced no gains over the past 20 years in labor productivity, defined as output per worker (in health care, the “output” is the volume of activity — including all encounters, tests, treatments, and surgeries — per unit of cost). Although it is possible that some gains in quality have been achieved that are not reflected in productivity gains, it's striking that health care is not experiencing anything near the gains achieved in other sectors. At the same time, health care labor is becoming more expensive more quickly than other types of labor. Even through the recession, when wages fell in other sectors, health care wages grew at a compounded annual rate of 3.4% from 2005 to 2010.
In addition, it is very difficult to demonstrate gains in quality. I contend that these two shortfalls are in fact related to the inadequacy of the information systems used in health care. When you can't reliably track things, you end up wasting resources and generating poor-quality products.
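To make the definition in that passage concrete, here is a minimal sketch in Python of "output per unit of labor cost." The figures are invented purely for illustration (they are not numbers from the NEJM article): if activity volume grows 3% while wages grow at the 3.4% compounded rate quoted above, measured productivity actually falls.

# Sketch of the labor-productivity definition quoted above:
# output = volume of activity (encounters, tests, treatments, surgeries),
# productivity = output per unit of labor cost.
# All figures below are hypothetical and for illustration only.

def labor_productivity(activity_volume: float, labor_cost: float) -> float:
    """Units of activity delivered per dollar of labor."""
    return activity_volume / labor_cost

# Hypothetical year 1: 100 million encounters, $500 billion in wages.
year1 = labor_productivity(100e6, 500e9)

# Hypothetical year 2: activity grows 3%, wages grow 3.4% (the compounded
# wage growth rate cited above), so measured productivity slips.
year2 = labor_productivity(100e6 * 1.03, 500e9 * 1.034)

print(f"Year 1: {year1:.6e} encounters per wage dollar")
print(f"Year 2: {year2:.6e} encounters per wage dollar")
print(f"Change: {100 * (year2 / year1 - 1):+.2f}%")

The arithmetic is trivial; the hard part, and the point of this post, is that without adequate information systems neither the numerator (activity, adjusted for quality) nor the denominator (true labor cost) can be tracked reliably in the first place.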

The story in biomedical research is perhaps a bit more nuanced. Biomedical research has revolutionized diagnostics and therapeutics, but we are facing diminishing returns. The antibiotic and vaccine revolution deployed in the early to mid 20th century made a huge impact at very modest cost; arguably the returns of these endeavors dwarfed the costs of development and deployment. We are at a different point now, where the cost of new drugs to treat chronic diseases is simply off the charts: every new decade moves the decimal point over one place. Furthermore, the deliverables for individual researchers are not necessarily things that have an impact on people in the near term. The funding system values a different sort of productivity, based upon publications. The explosion of scientific publishing has, until recently, not been accompanied by any real change in the mechanics of vetting and review, and the problem that has arisen is one of quality control.

However, funding agencies have attempted to develop models to oversee quality, and independent entities such as Retraction Watch (http://retractionwatch.com/) have stepped up to inject quality control into parts of the process not adequately addressed by peer review. Not surprisingly, there has been an explosion of paper retractions as well as exposés revealing major issues with reproducibility (Nature article). This is very disturbing, because what separates science from magic and alchemy is the ability to reproduce results.

What all of these contemporary processes have in common is that they depend heavily upon human beings doing their jobs with manual processes and highly subjective judgment. Furthermore, the products of these efforts cannot be readily and consistently assessed for quality. As long as this remains the case in health care delivery and biomedical research, it will be hard to reap the same sort of gains from investment in these sectors as from investments in areas where productivity and quality can be assessed more robustly.

2 comments:

CAM said...

From a clinical perspective, a revolution in healthcare is only likely to come about with major breakthroughs in medical science that can be translated into effective and cost-efficient treatment. Until then it is going to be a labor-intensive industry, and one that requires skilled labor at that. There is unlikely to be a Henry Ford coming along with an assembly-line idea that will revolutionize work flow. Counterintuitively, the advent of computers in medicine has only compounded the problem, in large part because government diktats and reimbursement issues have driven the form and function of their application.

On the research side, unlike in the early history of modern medicine, most of the low-hanging fruit has been picked. Not that there won't be dramatic discoveries, but even those will likely result in expensive therapeutics (e.g. rituximab, imatinib). These will be a lot slower in coming; we are spoiled by the rapidity of medical progress over merely the past 50-75 years and the perception that that rate should continue. However, progress in medical research is also a matter of national priority. As Tom Coburn pointed out in a recent editorial in the Wall Street Journal:

"...the entire Apollo program—all 17 missions and six lunar landings—cost about $108 billion in today’s dollars. Back on Earth, $100 billion could fund the National Cancer Institute for 20 years."

That trend continues, he notes: "U.S. taxpayers have shelled out $75 billion to operate the ISS [International Space Station] since 1994, according to government estimates. NASA predicts it will cost another $21 billion before 2020—a projection its inspector general called “understated” and “overly optimistic” in a September audit."

That kind of money could pay for a lot of expensive medications and labor-intensive care back on Earth until we realize that industrial revolution in healthcare.

NMollanazar said...

Fantastic post. Without a doubt, we are doing a terrible job as a profession of utilizing information systems. I think the biggest hurdle is administrative. I have been in meetings with high-level administrators (merely as an observer) where the discussion focused on ways to limit clinician access to data from the information systems.

I firmly believe that with centralized and streamlined access to clinical data, we will begin to see disruptive innovation akin to that seen during the Industrial Revolution. That is why I am a big advocate of the PCORI initiative. It is also why I am a proponent of systems that automatically de-identify data from the EMR and periodically (preferably weekly or even daily) dump said data into a database that can then be accessed by all physicians and used for research purposes. Physicians used to be able to tinker and experiment and track those changes themselves; we've lost that ability somewhere along the way.
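To make that idea concrete, here is a minimal sketch of the kind of weekly de-identified extract I have in mind. The file name (emr_extract.csv), the column names (mrn, encounter_date, diagnosis_code, note_type), and the salted-hash pseudonym scheme are all hypothetical, and a real pipeline would of course also have to satisfy HIPAA de-identification requirements.

# Illustrative sketch only: periodically copy EMR export data, minus direct
# identifiers, into a research database that physicians could query.
import csv
import hashlib
import sqlite3

# Hypothetical secret salt, kept outside the research database.
SALT = "replace-with-a-secret-kept-elsewhere"

def research_id(mrn: str) -> str:
    # Stable pseudonym so a patient's records still link up longitudinally.
    return hashlib.sha256((SALT + mrn).encode("utf-8")).hexdigest()[:16]

def deidentify_and_load(extract_path: str, db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS encounters "
        "(research_id TEXT, encounter_date TEXT, diagnosis_code TEXT, note_type TEXT)"
    )
    with open(extract_path, newline="") as f:
        for row in csv.DictReader(f):
            # Only non-identifying fields are copied; names, addresses, and
            # phone numbers in the export are simply never written out.
            conn.execute(
                "INSERT INTO encounters VALUES (?, ?, ?, ?)",
                (
                    research_id(row["mrn"]),
                    row["encounter_date"],  # dates may also need coarsening
                    row["diagnosis_code"],
                    row["note_type"],
                ),
            )
    conn.commit()
    conn.close()

# Run from a weekly (or daily) scheduled job, e.g. via cron:
# deidentify_and_load("emr_extract.csv", "research.db")

The point is not the particular tooling; it is that a simple, regularly refreshed, de-identified copy of the data would let physicians tinker and track changes again.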

I also feel like part of the problem is that we are afraid to accept the truth: a lot of what is being produced in the healthcare setting is utter garbage. Many of the notes in my institution's EMR are meaningless and wasteful. Unfortunately, the same thing applies to some published research: while there is a lot of amazing research, there is perhaps an equal amount of questionable work being published on a daily basis. Given this reality, I am very interested in the SHARPn project. The idea that a system could be created and taught to standardize notes in the EMR for secondary use is novel, and I think it may even prove disruptive.

All that said, in response to the previous comment, I don't think it's fair to say we should rob Peter to pay Paul until we figure things out. Space exploration is an important endeavor, if only because it instills nationalism and pride in the American people. And while it is expensive, myriad technologies and products have been brought to market after being created for use by NASA, which is another important thing to keep in mind.