Flagler Hospital had a common problem: only one-third of physicians at the 335-bed community hospital in St. Augustine, Fla., were regularly following order sets.
The pre-defined groups of clinical orders are meant to standardize care and reduce clinical variation.
While some variation is expected to personalize care, broad deviations from standards of care lead to unnecessary services, driving up the cost of treatment and the possibility of additional complications.
“Over the past few decades we’ve come to realize clinical variation plays an important part in the overuse of medical care and the waste that occurs in healthcare, making it more expensive than it should be,” said Dr. Michael Sanders, Flagler’s chief medical information officer. “Every time we’re adding something that adds cost, we have to make sure that we’re adding value.”
Clinical variation is a significant contributor to healthcare’s multibillion-dollar overuse problem. And it’s not new. A 2012 study found that at least 20% of clinical care spending could be reduced without affecting health outcomes. Even within the same hospital, higher-spending physicians don’t tend to see better patient outcomes: a study of 22,000 physicians across nearly 3,000 hospitals found no link between physician spending and lower patient mortality or readmissions.
This has become particularly problematic in the era of value-based care, where hospital reimbursement is increasingly tied to a facility’s ability to improve patients’ health outcomes while keeping costs and readmissions down.
Enter machine learning, artificial intelligence and data analytics.
“Fundamentally, what these technologies do is help us recognize important patterns in the data,” said Dr. Douglas Fridsma, CEO of the American Medical Informatics Association.
That’s the case at Flagler, where a year-old AI program has dramatically improved adherence to order sets. Last year, the hospital began uploading data from its electronic health record, billing, surgical, analytics and enterprise data warehouse systems into a service from AI software vendor Ayasdi. The company applied machine learning to analyze the information and spot trends.
The application grouped similar patients with similar care paths. Hospital staff can review these groups to determine which care paths—including not only what services patients received, but also in what sequence—led to the best patient outcomes, coupled with cost savings. Staff can subsequently create order sets based on these events.
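Ayasdi’s actual method is proprietary, but the general idea described above — grouping patients by similar care paths, then ranking the groups by outcomes and cost to find candidate order sets — can be sketched in a few lines. The sample data, field names and the simplification of grouping only identical event sequences are all illustrative assumptions, not the vendor’s implementation:

```python
from collections import defaultdict

# Hypothetical sample data: each record is a care path (an ordered
# sequence of services), a total cost, and a readmission flag.
patients = [
    {"path": ("antibiotics", "nebulizer", "chest_xray"), "cost": 5200, "readmitted": False},
    {"path": ("antibiotics", "nebulizer", "chest_xray"), "cost": 4800, "readmitted": False},
    {"path": ("chest_xray", "antibiotics"), "cost": 6100, "readmitted": True},
    {"path": ("chest_xray", "antibiotics"), "cost": 6550, "readmitted": False},
]

def rank_care_paths(patients):
    """Group patients by identical care path, then rank groups by
    readmission rate first and average cost second."""
    groups = defaultdict(list)
    for p in patients:
        groups[p["path"]].append(p)

    summary = []
    for path, members in groups.items():
        summary.append({
            "path": path,
            "n": len(members),
            "readmit_rate": sum(m["readmitted"] for m in members) / len(members),
            "avg_cost": sum(m["cost"] for m in members) / len(members),
        })
    # The best candidate order set is the path with the lowest
    # readmission rate; cost breaks ties.
    return sorted(summary, key=lambda g: (g["readmit_rate"], g["avg_cost"]))

best = rank_care_paths(patients)[0]
print(best["path"])  # → ('antibiotics', 'nebulizer', 'chest_xray')
```

Note that because the path is an ordered tuple, this toy ranking distinguishes not just which services were delivered but their sequence, mirroring the article’s point that order of care matters.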
Flagler first targeted pneumonia, one of the conditions for which hospitals can see Medicare reimbursements drop due to high readmission rates.
“At the time, I thought pneumonia was a simple, straightforward disease,” Sanders said. “It gave us time to learn the tool before we hit the more complicated processes like septic shock and congestive heart failure … and time to prove to our board that this new-fangled tool was actually worth it.”
Using the AI application from Ayasdi, Flagler determined, for instance, that when a pneumonia patient also has chronic obstructive pulmonary disease, it’s helpful to start nebulizer treatments as early as possible. By showing clinicians the data, Sanders said, the hospital boosted adherence to order sets to nearly 80%. And that has paid off, saving $1,350 per pneumonia patient and cutting the readmission rate from 2.9% to 0.4%.
Flagler is rolling the program out to COPD, heart failure and sepsis patients. The hospital relies on its physician IT group, or what Sanders calls the “PIT crew,” to develop the order sets. The crew is made up of one physician from every department. Sanders founded the group nearly 10 years ago to bolster the hospital’s EHR efforts. Since then, the physicians have helped hospital leadership make decisions on various IT workflows.
“I don’t think I can emphasize this enough, but it is so important to bring them in at the beginning so they understand what this process is,” he said.
Alongside physician support, AI projects ultimately rest on the quality of the data they use, Fridsma said, and that’s something hospitals should be cognizant of from the beginning. Bad data will result in bad patterns, he said.
Flagler used its own historical data, rather than adapting a project from another organization with a different patient mix. Hospitals using outside data need to dig deep into where it came from, as well as how variables, such as medical services and patient outcomes, are defined.
“This (new order set) is not coming down to us from on high, from Mayo Clinic or Intermountain, or another group that’s saying, ‘Here’s a good thing we’ve found,’ ” Sanders said. “We’re using our own data and saying, ‘Here’s what we’ve done, here’s where we fell short and let’s fix that.’ ”