UChicago Medicine grows into role of AI innovator
AI beats expectations in revenue cycle and operational performance at the city's South Side academic medical center.
At University of Chicago Medicine, leaders assessed the health system’s progress toward AI adoption expecting the worst — before discovering the organization may well be one of the field’s best users.
Two years into its deployment of AI, the system, which goes by UChicago Medicine, could point to solid wins ahead of a self-evaluation, especially in revenue cycle automation, clinical documentation improvement and operational efficiency, areas where the organization was making visible progress.
But there were also times when progress felt “frustratingly slow,” as one executive described it — and the team yearned for validation.
“We looked around, and we had some concerns that we might not be adopting AI at the pace we want to be, knowing the promise of AI for healthcare and how fast the technology is changing,” said Ivan Samstein, executive vice president and CFO, UChicago Medicine.
The UChicago Medicine assessment had two main goals.
“We believed we’d gain greater clarity as well as a living, breathing document for AI innovation,” Samstein said. “Our view is that you need to have a road map of where you’re going; otherwise, it’s kind of hard to envision where you need to be.”
Taking stock
UChicago Medicine invited its leadership team to spend a day last spring evaluating its efforts to build a foundation for AI innovation. Samstein describes the discussions as “management team-led,” although insight from two consulting firms provided a basis for assessing the health system’s progress.
Ultimately, UChicago Medicine discovered it fares well in AI maturity by:
- Using AI to strengthen business outcomes
- Supporting a culture of AI innovation
- Implementing a governance framework to manage and guide its efforts
- Keeping focus on both measuring and optimizing performance and applying AI successes across the organization
These are capabilities that take AI innovation beyond an aspirational vision toward transformational success, as AI maturity models developed by Gartner, Deloitte, Accenture and others indicate.
“For us, the real key has been our ability to identify successes in early adoptions across the institution and leverage those at scale,” Samstein said. “As we understand it, we’re slightly ahead of other healthcare institutions on our use of AI for revenue cycle, and we see some additional opportunities there.”
Of course, leaders know they must keep moving.
“You could easily be ahead of the pack today and behind the pack tomorrow if you’re not staying focused,” Samstein said. “That’s why it’s critical to have a road map for innovation that is being tracked and thought about, a governance structure that speeds execution, and the readiness to take stock of the best use cases happening in your organization.”
Early days for most
“If organizations are simply pursuing an AI strategy, that is a noble goal,” said Erik Swanson, senior vice president, data science and analytics, Kaufman Hall. “But unless the adoption and advancement of AI is done in the pursuit of solving business goals and objectives, those efforts are a little misguided.
“Generally, what we’re seeing is that most healthcare organizations are not particularly far along in their analytical maturity,” he said. “Those that are adopting AI and advanced analytics with an eye toward addressing the business challenges they are facing are making relatively quick strides as they move through that continuum.”
What does it take to achieve AI maturity in healthcare? UChicago Medicine’s experience and insight from healthcare experts point to three key considerations.
1 Establish a road map for AI value — and revisit it, as needed
AI adoption in healthcare isn’t a race to implement as many technologies as possible, faster than competitors. It’s about supporting the organization’s business needs and strategy.
“If I had one defining piece of advice around AI adoption, it is to focus on the organization’s business objectives and see where AI could help support those objectives rather than the other way around,” Swanson said.
That was a key takeaway UChicago Medicine leaders gained from their AI assessment.
“It’s critical to have a road map for adoption that is centered on your business goals,” Samstein said. “That road map may be a living document — one that can be updated as needed — but it’s vital in ensuring there is a launching-pad vision that is being tracked and thought about throughout the AI journey.”
At UChicago Medicine, efforts to embed AI-powered tools into clinical and operational workflows initially stemmed from a desire to enable staff to do more with less, given workforce shortages — especially in entry-level revenue cycle positions — and rising levels of burnout nationally.
“We’re consistently bringing on new technologies that can help streamline the efficiency of our providers and our clinicians and help make the back-end revenue cycle function run a bit more smoothly — and some of those objectives do have overlap,” said Jason Tolbert, executive director, clinical applications for UChicago Medicine.
Use of robotic process automation helps revenue cycle teams more effectively manage rising rates of initial denials. It also automates administrative tasks like claims status checks.
“In revenue cycle, it’s not about managing people. It’s managing information,” said Nicole Fountain, vice president, revenue cycle, UChicago Medicine. “We’re never going to have enough people to get to all the work. We have to figure out how to work smarter, not harder, and that means figuring out how to leverage technology to do the work for us. It’s also about knowing when to pull what levers. You can’t just use one type of AI solution and expect it to solve all your problems. You also can’t automate bad processes and expect to achieve the desired results. It’s a work in progress.”
Two years ago, UChicago Medicine began its AI journey on the operations side with five pilot projects, including one in revenue cycle. Over the past year, the health system has added clinical documentation improvement (CDI) to the mix.
“We have teams of data scientists and wanted to take a logical approach to looking at all of the discrete data we have in patient charts and say, ‘What inferences can we make? How can we prioritize work for the CDI team?’” said Mary Kate Selling, MHA, chief analytics officer at UChicago Medicine.
“We don’t have the structure to staff people to look at every single patient chart to make sure the complexity is fully captured, but we could leverage the AI in the data science to say, ‘Hey, there’s evidence of this. Look here; start here,’” Selling said. “We were really able to move the needle in capturing everything from case mix index to expected mortalities and point to specific evidence showing the complexity of the patients we care for.”
The advances made in CDI have strengthened payer contract negotiations. They have also empowered the clinical team to use prediction tools within the EHR to prioritize care interventions for those at risk.
“The CDI team touches more patients because AI narrows their focus to the risk factors that matter most,” Selling said. “They’re able to review more cases, faster.”
2 Build a robust data governance function with clinical representation
A robust data governance team will increase AI agility and hasten execution. It will also ensure the organization is making the right decisions around where to invest and that the right people are evaluating the results.
UChicago Medicine didn’t begin its AI journey with AI governance in mind, although the health system does have a strong IT governance structure. However, as the organization began to accelerate adoption of AI tools and solutions, leaders quickly understood the value of AI-specific governance, which the health system undertakes through its IT governance team.
“What you don’t want is to be so organized around AI governance that it slows down organic pushes for AI from different pockets of the organization, but you also want to make sure that someone’s got the overall vision for where you want to go and how to get there,” Samstein said.
Key to success at UChicago: representation from the service lines and departments affected by new innovations.
“It’s imperative that you have oversight of all of the AI projects initiated, with input from the operational owners that have a vested interest in the success of that product,” Tolbert said.
“We always have more project requests than we have time for,” Tolbert said. “With an interdisciplinary data governance committee, we’re able to prioritize projects together in a way that makes sense, holistically, for the entire portfolio. This has improved our success rate.”
One way to measure the analytical maturity of a healthcare organization is to consider not just its ability to use analytics to describe what has occurred and where, but also whether there are elements such as an effective data governance model and an enterprise data warehouse that allows organizations to establish a single source of truth, said Swanson of Kaufman Hall.
“That’s your table stakes, if you will, in the maturity spectrum,” he said.
From there, analytic maturity increases according to the depth of an organization’s predictive capabilities — not only predicting what will happen, but also recommending action steps in response.
“When I look at maturity from that lens, most healthcare organizations lag other industries pretty considerably relative to the adoption of AI and advanced analytics,” Swanson said. “Academic medical centers tend to be further along than short-term, acute care hospitals, although those organizations are certainly adopting AI and advanced analytics to some degree.”
Risk of failure intensifies without best-practice data governance around how AI will be used, which data will feed AI tools and how these tools will be trained.
“I think many organizations are grappling with these issues in developing their AI governance,” Swanson said.
3 Emphasize validation of AI tools
“Validation is key,” Tolbert said. “Whether it’s an operational or clinical person, our validation process is very rigorous. We make sure we’re testing for all types of bias and that we’re not inserting bias into our system by deploying something that is untested and hasn’t been validated.”
For clinical AI initiatives, UChicago Medicine’s analyst intervention unit partners with the health system’s IT department and, where appropriate, the organization’s EHR vendor to validate a new tool. AI governance teams meet regularly, some monthly, some every six weeks, to review performance, and a yearly review follows.
“This is really about making sure the results are as accurate as the interventionist needs them to be,” Selling said. “If anything is flagged during this process, we re-escalate the tool for additional validation so we can make sure it’s safe for that patient population before it is expanded to other hospitals and locations.”
Results of nonclinical AI initiatives are reviewed during regular meetings as well to determine whether the expected results are being achieved and whether a change in approach is needed.