INDUSTRY FOCUS

The Future of Interactive, AI-Assisted Radiology Reports

Sunday, December 1, 2024

By Mary Henderson

Advancing imaging AI for health care, radiology and radiologists

Richard J. Bruce, MD
Matthew Lungren, MD, MPH

What began as a discussion about the future possibilities of AI-assisted radiology documentation is about to become reality, thanks to a collaboration between Microsoft and the University of Wisconsin-Madison (UW).

The academic-industry partnership is nearing completion of a model that produces an interactive, modular draft report from chest X-ray images, allowing it to serve as an editable note in both academic and clinical settings.

“When ChatGPT debuted, Microsoft gave a small group of radiologists an internal preview of some things they were starting to think about,” explained Richard J. Bruce, MD, radiology vice chair of informatics at UW. “One example was an AI model that analyzes a chest X-ray and creates a report that the radiologist could then interact with, asking for more information and treatment recommendations. The enthusiasm was off the charts.”

To finalize the project, Microsoft partnered with UW Radiology to help validate and integrate the model. 

“There’s no way we as a university could do something like this on our own,” Dr. Bruce said. “I believe great innovation happens when you work together.”

While AI models like ChatGPT work well with text, they have limited knowledge of radiological images and are prone to fabrication. To solve this challenge, the team decided to take a modular approach based on work completed by Microsoft Research.

New health care AI models in the Azure AI model catalog allow the health and life sciences ecosystem to use foundation models to rapidly build and deploy AI solutions tailored to their specific needs, all while minimizing the extensive compute and data requirements typically associated with building multimodal models from scratch.

“Rather than building what we would call a giant ‘god’ model that's so big and bulky it can only be hosted in one place, we decided to break it down into modular components that are open source and not limited to only research use,” said Matthew Lungren, MD, MPH, chief scientific officer, Microsoft Health and Life Sciences.

The lightweight embedding model processes medical images in a way that is understandable to a large language model, which can then perform specific tasks, such as reading an image and writing a report.

Over the last year, Microsoft has been working with UW Radiology and its database of 2.5 million chest X-rays to fine-tune the model, which Dr. Lungren hopes will one day be the DAX™ Copilot equivalent for radiology.

Now deployed in 500 outpatient settings, DAX Copilot captures patient-physician conversations and converts them into clinical documentation summaries that the clinician can then edit.

“Our thesis was, if you can take a transcript and turn that into an editable note, why can’t you do the same for an image? What if all I must do is look at the image and note, edit and sign it, just as if it came from a resident?” Dr. Lungren said. “Could this technology also have an impact on burnout and workforce shortages in the specialty?”

If the model proves successful, Dr. Bruce believes it represents only the tip of the iceberg of potential applications in radiology.

“As we get better and better models, and if you can interact with them in more facile ways, I think we’ll continue to identify ways to talk with the model to get even more information,” he said. “This has the potential to dramatically change the efficiency of radiologists’ workflows.” 

Visit Microsoft at Booth 1311 during RSNA 2024 to see product demonstrations and learn more about imaging AI solutions.