It seems as if news breaks every day about another hospital laying off workers – or worse, closing; at last count, 21 hospitals stopped treating patients in 2016. Hospitals and healthcare systems across the country, including some of the largest, laid off thousands of employees. In November 2016 alone, 13 providers eliminated more than 1,000 jobs.
In September, HCCA’s Vice President of Marketing Kathleen Johnson said, “The actions taken recently are in accordance with our planned and ongoing strategic realignment of the hospital’s operations. We continue to streamline operations, improve efficiencies and healthcare delivery to our community.”
In October, UC Irvine Health spokesman John Murray said, “The layoffs are part of a multipronged approach to improving the health of the system. Before considering the layoffs, we looked for opportunities to reduce expenses as well as maximize potential revenue.”
What’s Going On?
Steep operating losses and severe budget shortfalls, driven by the industry's rapid shift toward value-based reimbursement, are forcing hospitals to consolidate operations as much as they can: merging operations, closing labs, reducing headcount and finding every possible way to lower costs without affecting quality of care – all while continuing to meet spiraling demand on the clinical side.
As the industry shifts to value-based care, hospitals and healthcare systems face increasing pressure to reduce their cost structures – a difficult challenge – without affecting quality of care. For many, consolidation is the next step in the progression. But after the merger is completed and redundancies eliminated, the big question is “How do we make the most out of all these resources?”
Simply put, hospitals are under constant pressure to do more with less – in many cases, much more with much less. They face an operational paradox: the same scarce resources are both overbooked and underutilized within a single day. This leads to several undesirable outcomes: millions of dollars of unnecessary operational costs, long patient wait times, overworked staff and an insatiable appetite for expanding existing facilities or constructing entirely new ones.
Data to the Rescue
Electronic health records (EHRs), fueled by government regulations and incentives, are opening opportunities for operational efficiencies. In the past decade, hospitals have invested hundreds of millions of dollars in EHRs, business intelligence and Lean/Six Sigma initiatives. Now, CIOs are asking “How can we leverage all this data to reduce costs?”
In a CIO survey conducted earlier this year, IT optimization ranked as one of the top 10 areas of focus. “Call it performance, call it tech-based enabled improvement, call it optimization. Call it whatever the buzzword is, but [we’re] trying to get more capability out of the investment we’ve made in the past few years in EHRs,” said Todd Hollowell, COO of Impact Advisors.
EHRs capture every bit of information related to a patient: historical conditions, medications, treatments, everything. The goal is to give that information to whoever needs it, whenever they need it. An obvious use case is to give everyone, from physicians to pharmacists, a valuable “situational awareness” that avoids waste such as duplicate lab tests and unnecessary lab orders. But that’s just the beginning. Given the enormous amount of data stored per patient (more than two terabytes), it is possible to build sophisticated interconnected systems that provide much better situational awareness to everyone involved in the core patient pathways. One of the core ways to reduce costs is to maximize resource utilization and route care providers to the right place at the right time, reducing length of stay and improving overall efficiency. Predictive analytics provides a scalable way to accomplish this.
Predictive Analytics Is the Answer
Historically, process improvement efforts in hospitals worked with small, historical snapshots of data from which the core operational issues were identified. From this, strategies were developed, implementation plans executed and the disciplines for continuous improvement were established. This was the best approach when all that was available was rear-view mirror data snapshots and Excel as the analytic engine of choice. Now, with the explosion of smart devices, computational power in the cloud, and the growing pervasiveness of data science and machine learning algorithms, an entirely different realm of operational optimization is suddenly possible.
Consider the following scenarios on how predictive analytics is already optimizing patient pathways within hospitals:
– Optimizing access to treatments such as chemotherapy: By looking at historical demand patterns and operational constraints, sophisticated forecasting algorithms can predict the daily volume and mix of patients and orchestrate appointment slots so that there are no “gaps” between treatments. This radically improves chair utilization, reduces patient wait times and decreases the overall cost of operations. Doing this without sophisticated data science is hard; for example, the number of possible orderings for slotting just 70 patients into their treatments in a 35-chair infusion center exceeds 10^100. Trying to solve this problem with pen, paper or Excel is a pointless exercise.
– Operating rooms are key resources within the hospital: Study after study shows that OR utilization at most large hospitals is at best 50-60 percent. In most hospitals, operating rooms are allocated to surgeons using “blocks” – for simplicity, the blocks are often either half-day or full-day. Even the most prolific and productive surgeons often don’t fully utilize the blocks they are given, and the process for reallocating blocks on a monthly basis or even for last-minute block swaps is cumbersome and manual. Using data science and machine learning, hospitals can monitor utilization, identify pockets for improvement, automatically reallocate underutilized blocks and improve overall operating room utilization. A 3-5 percent improvement in block utilization is worth $2 million per year for a surgical suite with just four operating rooms.
– Optimizing in-patient bed capacity utilization: In-patient bed capacity is a constraining bottleneck in most hospitals, yet virtually every hospital manages it with an arithmetic-based “huddle” approach: review the overnight patient census in each unit, add known incoming patients, subtract known discharges and decide whether the unit is flirting with the limits of its available capacity. This cycle repeats, often several times per day, with a planning horizon of only the day at hand. Compare that with Google, which completes the search bar while you are typing because it has analyzed millions of similar search terms and automatically presents the four or five queries you are most likely to submit. Imagine looking at each overnight patient, finding the 1,000 prior patients over the last two years who entered the hospital with a similar diagnostic or procedure code and reviewing their “flight path” through the hospital (i.e., the number of days spent in each unit prior to discharge); from this, an aggregate probabilistic assessment of the likely occupancy of each unit could be developed. Not only would it provide a better answer for today, it would also help anticipate the evolving unit capacity situation over the next 5-7 days, leading to smarter operational decisions on transfers, elective surgery rescheduling, etc.
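The combinatorics behind the infusion-scheduling point above can be verified directly: the number of distinct orderings of 70 patients is 70!, which already exceeds 10^100 before chair assignments or timing constraints are even considered. A quick check in Python:

```python
import math

# Number of distinct orderings of 70 patients awaiting treatment slots.
# This ignores chair assignments and timing constraints, which only
# enlarge the search space further.
orderings = math.factorial(70)

print(orderings > 10**100)  # → True: the ordering count alone tops 10^100
print(len(str(orderings)))  # number of decimal digits in 70!
```

For comparison, the number of atoms in the observable universe is commonly estimated at around 10^80 – which is why brute-force scheduling with pen, paper or Excel is hopeless, and heuristic, data-driven optimization is required.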
A similar machine learning approach can help orchestrate patient flows at clinics, labs, pharmacies and any unit within the hospital network that struggles with the operational paradox of being overbooked and underutilized at the same time.
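The “flight path” idea can be sketched in a few lines. Assuming historical stay records keyed by diagnosis code, each listing the units a prior patient moved through and the days spent in each (all names and data below are hypothetical, purely for illustration), an aggregate occupancy probability per unit can be estimated for any day ahead:

```python
from collections import defaultdict

# Hypothetical historical records: (diagnosis_code, [(unit, days), ...])
# describing each prior patient's path through the hospital before discharge.
HISTORY = [
    ("DRG-470", [("surgery", 1), ("ortho", 2)]),
    ("DRG-470", [("surgery", 1), ("ortho", 3)]),
    ("DRG-470", [("surgery", 2), ("ortho", 2)]),
]

def occupancy_forecast(diagnosis_code, day_ahead, history=HISTORY):
    """Estimate P(patient occupies each unit) `day_ahead` days after
    admission, using the flight paths of prior patients with the same code."""
    matches = [path for code, path in history if code == diagnosis_code]
    if not matches:
        return {}
    counts = defaultdict(int)
    for path in matches:
        elapsed = 0
        for unit, days in path:
            # Patient is in `unit` on days [elapsed, elapsed + days).
            if elapsed <= day_ahead < elapsed + days:
                counts[unit] += 1
                break
            elapsed += days
    return {unit: n / len(matches) for unit, n in counts.items()}
```

For example, `occupancy_forecast("DRG-470", 1)` yields roughly two-thirds probability the patient is in ortho and one-third still in surgery on day 1. Summing these per-patient distributions over the entire overnight census gives an expected occupancy per unit for each of the next several days – the probabilistic replacement for the daily huddle's arithmetic.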
Hospitals are starting to see gains from predictive analytics, especially on the operational side. OR block scheduling is an area where the ROI can be significant. UCHealth, the largest healthcare network in Colorado, deployed a mobile block swap system that lets surgeons request and release blocks via their smartphone. The system uses predictive analytics to identify blocks that can be swapped. Within three months of deployment, the network saw a 16 percent increase in utilization of the swapped blocks compared to average blocks. A “Smart Performance Tracker” continuously monitors OR utilization and produces actionable insights for data-driven decisions about which surgeons’ block time to reallocate and to whom, and a Smart Staffing module creates optimized monthly/quarterly OR staffing plans.
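The monitoring half of such a workflow is simple to sketch. Given per-block records of allocated versus actually used OR minutes (the records and the threshold below are illustrative assumptions, not UCHealth's actual system), blocks falling below a utilization cutoff can be flagged as candidates for release or swap:

```python
# Hypothetical block records: (surgeon, allocated_minutes, used_minutes)
BLOCKS = [
    ("Dr. A", 480, 450),   # full-day block, well used
    ("Dr. B", 240, 90),    # half-day block, mostly idle
    ("Dr. C", 480, 200),
]

RELEASE_THRESHOLD = 0.6  # illustrative cutoff for flagging a block

def underutilized_blocks(blocks, threshold=RELEASE_THRESHOLD):
    """Return (surgeon, utilization) pairs for blocks whose utilization
    falls below the threshold – candidates for reallocation."""
    flagged = []
    for surgeon, allocated, used in blocks:
        utilization = used / allocated
        if utilization < threshold:
            flagged.append((surgeon, round(utilization, 2)))
    return flagged
```

In a production system the threshold would itself be learned from historical demand rather than fixed, and flagged blocks would feed the swap-request workflow; the sketch only shows the core monitoring logic.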
NewYork-Presbyterian’s NCI-designated cancer center deployed predictive analytics to shape patient demand. The system uses data science to mine historical appointment data, understand traffic patterns and “level load” patients throughout the day, leading to an astounding 55 percent lower wait times at peak hours, 40 percent lower wait times overall and 17 percent higher patient volumes. Several other hospitals, including Stanford Health Care, Wake Forest, UCSF and more have seen significant gains with predictive analytics for infusion scheduling.
It’s clear the challenges of 2016 will continue throughout 2017 and beyond. The role data, predictive analytics and machine learning can play in transforming how providers do more with less is equally clear.
Originally published at Big Data