The same principles can be applied to healthcare. But the concept of leveraging all the data available in a healthcare system to change long-held processes and improve results can seem overwhelming. Instead of trying to eat the elephant in one sitting, hospital leaders need to take a strategic approach: where inside your hospital can you get the biggest bang for the buck?
How do you start? Data use matures in stages, and it's not unusual for healthcare organizations to sit at different points on that maturity curve.
A strong business intelligence system that provides key dashboard metrics should give the management team the information they need to make day-to-day decisions. A good example is being able to look at a workforce productivity index.
For example, one of the more common financial metrics in supply chain is supply expense per adjusted admission, or supply expense per patient day. Both give a high-level look. But if your data is structured so that you can drill down one layer into expense drivers, you have a better understanding of the issues. We use four major categories—medical devices, pharmacy, commodities and blood—and then another layer of subcategories to provide even more insight. If you saw a significant change in chemotherapy, for example, are you treating more cancer patients? If so, you should expect an increase; if not, you need to look harder to find out whether it is a process problem, a pricing change or a technology gap. Depending on the problem, actions could include renegotiating contracts, changing supplies or even changing how you position staff at the point of care.
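The top-level metric and the one-layer drill-down can be sketched in a few lines. This is a minimal illustration: the category names mirror the four buckets described above, but the expense line items and admission volume are invented.

```python
from collections import defaultdict

# Hypothetical expense line items: (category, subcategory, amount).
# Amounts and the admission count are made up for illustration.
expenses = [
    ("medical devices", "joints", 300_000.0),
    ("pharmacy", "chemotherapy", 120_000.0),
    ("pharmacy", "antibiotics", 45_000.0),
    ("commodities", "bed pans", 2_000.0),
]
adjusted_admissions = 1_500  # assumed volume for the period

# High-level metric: total supply expense per adjusted admission.
total = sum(amount for _, _, amount in expenses)
supply_expense_per_adjusted_admission = total / adjusted_admissions

# Drill one layer down into expense drivers by major category.
by_category = defaultdict(float)
for category, _, amount in expenses:
    by_category[category] += amount

print(f"Supply expense per adjusted admission: "
      f"${supply_expense_per_adjusted_admission:,.2f}")
for category, amount in sorted(by_category.items(), key=lambda kv: -kv[1]):
    print(f"  {category}: ${amount / adjusted_admissions:,.2f} per adjusted admission")
```

A spike in the pharmacy line would then prompt exactly the chemotherapy question above: drill into the subcategory and check it against patient volume.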
Operational measurements are another set of data that help highlight your processes. In revenue cycle, the number of claims denied is a common metric. But does your data reveal the root cause of the denial, allowing you to track and isolate problems upstream? If you can understand the cause of denials, often you can fix processes on the front end to reduce the number of denials you are dealing with on the back end. In workforce, you may look at hours of contract labor per department over time to identify where more attention is needed. Is there too much employee turnover in one area or recruiting problems in another?
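As a toy illustration of that root-cause idea, denied claims can be rolled up by cause so the biggest upstream problems surface first. The claim records and cause labels here are hypothetical.

```python
from collections import Counter

# Hypothetical denied claims, each tagged with a root cause when the
# denial is posted. IDs and cause names are invented.
denials = [
    {"claim_id": "C-1001", "root_cause": "missing prior authorization"},
    {"claim_id": "C-1002", "root_cause": "coding error"},
    {"claim_id": "C-1003", "root_cause": "missing prior authorization"},
    {"claim_id": "C-1004", "root_cause": "eligibility not verified"},
]

# Count denials by cause; the most frequent causes point to the
# front-end processes worth fixing first.
by_cause = Counter(claim["root_cause"] for claim in denials)
for cause, count in by_cause.most_common():
    print(f"{count:3d}  {cause}")
```

Tracked over time, the same rollup shows whether a front-end fix actually reduced the back-end denial volume.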
No matter the division in the hospital, delivering the right metrics with enough depth and accuracy can sometimes be a challenge. Usually it's because the data is not standardized or consistently coded across departments, or across the same department in different hospitals in a system. If your data quality is poor—if the same thing is recorded or coded in several different ways—your system will not serve as the foundation you need to support analysis, create benchmarks and take meaningful action.
If you build a strong data foundation, savings can be significant. A Parallon client recently saw a 15 percent improvement in supply expense per adjusted admission in one year after adopting a more rigorous data analysis system. Most organizations that move to a new stage in data and process analysis can see a savings of 3 to 5 percent of total supply expense.
Likewise, hospital systems that have implemented Parallon’s workforce productivity software, which included development of a productivity index based on their goals, have identified savings opportunities of 5 to 7 percent of total labor spend through better labor management.
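One way such a productivity index can be framed, purely as a sketch: worked hours per unit of service, compared against a goal-based target. The department names, hours, volumes and targets below are all invented for illustration.

```python
# Hypothetical departmental data: worked hours, units of service, and a
# target hours-per-unit (HPU) set from the organization's goals.
departments = {
    "ICU":      {"worked_hours": 5_200, "units_of_service": 400,   "target_hpu": 12.0},
    "Med/Surg": {"worked_hours": 9_800, "units_of_service": 1_000, "target_hpu": 9.0},
}

productivity_index = {}
for name, d in departments.items():
    actual_hpu = d["worked_hours"] / d["units_of_service"]  # hours per unit of service
    # Index of 1.0 means on target; below 1.0 means more labor hours
    # were used than the goal allows, flagging a management opportunity.
    productivity_index[name] = d["target_hpu"] / actual_hpu
    print(f"{name}: {actual_hpu:.1f} hrs/unit, index {productivity_index[name]:.2f}")
```

A dashboard built on this index lets managers spot, at a glance, which departments are drifting from their labor goals.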
Standardization is important. Our supply-side item master contains 35,000 to 40,000 unique SKUs, but some supply chain operations have as many as 70,000—an indicator that an organization is consuming a lot of products and is not highly standardized. If you’re not standardized, you will have price and expense leakage. One client we served had 39 types of bed pans in its item master. Bed pans are not an expensive item, but if you are fragmented on something that simple, you are likely also fragmented on high-cost items, such as joint implants, and missing significant savings opportunities.
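A crude way to surface that kind of fragmentation is to normalize item descriptions and flag any normalized key carrying multiple SKUs. The descriptions and normalization rules below are illustrative only; a real item-master cleanup uses far richer matching.

```python
import re
from collections import defaultdict

# Toy abbreviation table; a real cleanup effort would have many entries.
ABBREVIATIONS = {"std": "standard"}

def normalize(description: str) -> str:
    # Lowercase, split on punctuation/whitespace, expand abbreviations,
    # and join, so cosmetic differences collapse into one key.
    words = re.findall(r"[a-z0-9]+", description.lower())
    return "".join(ABBREVIATIONS.get(w, w) for w in words)

# Hypothetical item-master descriptions.
item_master = [
    "Bed Pan, Plastic, Standard",
    "BEDPAN PLASTIC STD",
    "Bed-Pan plastic (standard)",
    "Hip Implant, Cemented",
]

variants = defaultdict(list)
for item in item_master:
    variants[normalize(item)].append(item)

# Any key with more than one SKU is a candidate for consolidation.
for key, items in variants.items():
    if len(items) > 1:
        print(f"{len(items)} SKUs appear to be the same item: {items}")
```

Running the same pass over high-cost categories is where the real savings show up, since each consolidated item strengthens contract leverage.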
The ability to create reliable data and use it for business intelligence, including utilizing benchmark data to understand performance against peers and identify opportunities, is Phase 1 in the data world.
Where the concept of big data comes into play is Phase 2, through predictive analytics. Using data science to analyze large volumes of data to hone every aspect of clinical or financial processes is widely recognized as having tremendous potential to solve some of the most difficult problems in healthcare—how to drive down costs, how to better manage populations, how to improve outcomes.
But leveraging the power of big data may seem out of reach for many hospital organizations because of the need for large data sets, standardization of that data, a data scientist to write the proper algorithms, and developers to pull it together so it is actionable.
A strategic approach with big data can allow hospitals to start consuming the elephant with an eye to the most important results for the organization.
For example, a large healthcare system might look across its hospitals at knee and hip replacements, all the different procedures and the different vendors. From a procedure and product-mix standpoint, which have the fastest recovery times? Which have the most readmissions, the most infections? Such an analysis requires massive amounts of data that must be standardized, but at the end of the day, it will drive better results than exist now.
The biggest barrier for most organizations is the amount of investment and time needed. Very few organizations today have undertaken predictive analysis because of the massive volumes of data required. You need information not only from your current in-house system, but also historical data, which may not exist in a reliable form. Another challenge is amassing the right data for certain analyses: clinical and payer data have traditionally been housed by two different entities.
Still, large sets of data are starting to be amassed and leveraged.
One way hospitals can take advantage of it to improve business performance is through partnerships, which enable analysis against deeper and more detailed data. For example, a managed service—for supply chain, workforce or revenue cycle—often works from a much broader, already-standardized data set and can leverage that intelligence on your behalf. Such partners have already developed the expertise, made the technological investments and are often farther along the data-consumption road.
For example, Parallon’s revenue cycle division is using data science to leverage its massive amounts of data on claims and readmissions to create more dynamic predictive metrics that can detect potential problems with bills before they are sent out. One account, one bill, one full payment within a reasonable time frame is the goal. A cleaner billing process saves time in the appeals process and reduces financial losses.
A denial probability metric that can be applied to every bill is the next step in utilizing neural networks that are responsive to changes in regulations or payer trends.
In a neural network, your system learns and relearns as more information comes in, such as remits from payers. It uses pattern recognition to find new commonalities and trends. In essence, you are taking advantage of a dynamic data science feed upon which workflows can be configured. If a bill meets a certain threshold of denial probability, the bill is held so the problem can be investigated. If it’s below the threshold, it gets sent.
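A heavily simplified sketch of that loop, using a single logistic unit as a stand-in for a full neural network: score each bill, hold it if the score crosses the threshold, and nudge the weights as remits come back. Feature names, weights and the threshold are all invented for illustration.

```python
import math

# Toy stand-in for the denial model: one logistic unit, not a real
# neural network. Features, weights and threshold are hypothetical.
weights = {"bias": -3.0, "missing_auth": 3.5, "high_dollar": 0.6}

def denial_probability(features: dict) -> float:
    # Logistic score in [0, 1] from a weighted sum of bill features.
    z = weights["bias"] + sum(weights[f] * v for f, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def route_bill(features: dict, threshold: float = 0.5) -> str:
    # Hold bills at or above the denial-probability threshold for
    # review; send everything else.
    return "hold" if denial_probability(features) >= threshold else "send"

def learn_from_remit(features: dict, denied: bool, lr: float = 0.1) -> None:
    # Online update: as remits come back from payers, move the weights
    # toward the observed outcome so the model relearns over time.
    error = (1.0 if denied else 0.0) - denial_probability(features)
    weights["bias"] += lr * error
    for f, v in features.items():
        weights[f] += lr * error * v
```

Under these made-up weights, a clean bill scores low and is sent, a bill flagged for a likely missing authorization is held, and each remit shifts the weights slightly, which is the "learns and relearns" behavior described above.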
Similar “live” metrics using analysis of large amounts of unstructured data can measure data quality and data completeness on bills. We all know that better documentation on an account means everything downstream will go better. But we don’t have a score for that. Using natural language processing, however, we can look at images spanning 1,000 pages of medical records to determine whether they contain, for example, the 10 critical elements we need for an appeal if payment is denied. The denial probability on a particular bill may be low, but if the claim is denied anyway, is the data quality good enough to support an appeal? Likewise, is the data on an account complete before the bill is dropped? Building those dynamic metrics into tools used on the front line allows action to be taken at the right time to improve results.
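Once the NLP step has extracted which elements are present in a record, the completeness score itself is simple. The element names below are hypothetical placeholders, and the extraction step is assumed to have already run.

```python
# Hypothetical set of critical elements an appeal would need; a real
# list would come from payer and regulatory requirements.
CRITICAL_ELEMENTS = {
    "physician_signature",
    "admission_order",
    "discharge_summary",
    "medical_necessity_note",
    "authorization_number",
}

def completeness_score(found_elements: set) -> float:
    # Fraction of critical elements detected in the account's record.
    present = CRITICAL_ELEMENTS & found_elements
    return len(present) / len(CRITICAL_ELEMENTS)

# Elements an upstream NLP pass (assumed here) found in one record.
record_elements = {"physician_signature", "admission_order", "discharge_summary"}
print(f"completeness: {completeness_score(record_elements):.0%}")
```

A front-line tool could then hold any account scoring below a chosen cutoff until the missing documentation is supplied, before the bill is dropped.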
The key for hospital leadership is to understand where they are on the data curve and what information would drive the changes most meaningful to their organization. Not unlike Beane, who knew his “pain points,” leaders need to identify their pain points and priorities and build a data system to support their goals. Then they need to leverage that data with the confidence that comes from a really good road map.