RSNA2023 Leading Through Change
Daily Bulletin

Generative AI May Improve Radiology Report Clarity

Thursday, Nov. 30, 2023

By Richard Dargan

Large language models can improve the readability of radiology reports by simplifying medical jargon and eliminating unnecessary words, according to research presented Wednesday.


The increasing complexity of imaging and the trend toward higher-level structured reporting have made radiology reports difficult for referring clinicians and patients to comprehend. Recent advances in large language models provide an opportunity to address this problem, said study lead author Ghulam Rasool, PhD, and senior radiologist Les Folio, DO, MPH, from the Moffitt Cancer Center in Tampa, FL.

"The timing is right to apply evolving large language models to improve the conciseness and structure of radiologist's reports that tend to be verbose, often with unnecessary language that does not contribute to the clinical question," Dr. Rasool said.

For the study, Dr. Rasool and Dr. Folio used GPT-4 to improve the signal-to-noise ratio (SNR) in radiologist reports. "Signal" in this context refers to content that contributes to the communication, while "noise" means unnecessary words that do not help convey meaning.

"Our goal is to maximize signal-to-noise ratio, much like a radio signal that is easier to hear and understand," Dr. Rasool said.

Using Prompts to Remove Redundant Text

The researchers first prompted GPT-4 to remove redundant words and information not useful for downstream diagnosis, treatment planning and patient reporting. Among the removed words and phrases were "there is" and "at this time." They then prompted GPT-4 to convert the higher-SNR report text into active-voice plain English that the general public could understand.
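The two-step workflow above can be sketched as a pair of chat prompts in the OpenAI message format. The prompt wording, model name, and placeholder report text are illustrative assumptions; the authors' actual prompts were not published in this article.

```python
def build_prompts(report_text: str) -> list[dict]:
    """Build the two illustrative prompts: redundancy removal, then
    plain-English conversion of the step-1 output."""
    step1 = {
        "role": "user",
        "content": (
            "Remove redundant words and phrases (e.g., 'there is', "
            "'at this time') from this radiology report without "
            "changing its clinical meaning:\n\n" + report_text
        ),
    }
    # Step 2 would be sent in a second call, with the model's
    # step-1 output substituted for the {step1_output} placeholder.
    step2 = {
        "role": "user",
        "content": (
            "Rewrite the following report in active voice and plain "
            "English for a general audience:\n\n{step1_output}"
        ),
    }
    # Each step would be sent separately, e.g. with the OpenAI client:
    # response = client.chat.completions.create(
    #     model="gpt-4", messages=[step1])
    return [step1, step2]

prompts = build_prompts("EXAMPLE REPORT TEXT")
```

Running the steps as two separate calls, rather than one combined prompt, mirrors the sequence the researchers describe: trim the noise first, then simplify the language.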

Removing the unnecessary words doubled the content signal-to-noise ratio while preserving the reports' meaning for physicians, in a more inviting, easier-to-read structured format.

For example, GPT-4 cut a 37-word block of text filled with medical jargon related to a kidney stone down to two brief, easy-to-understand sentences.

"We leveraged prompt engineering best practices on already trained AI models to optimize large language models' output, resulting in reports that are less than half of the original word count," Dr. Rasool said.

Initial comparisons of the resulting shorter, higher-signal reports showed that referring providers found them easier to understand, Dr. Rasool said. The improved signal-to-noise ratio of the radiology report also has the potential to improve radiology's service to patients.

"Though we have not tested the patient's perspective on improved understanding of more concise reports, we believe further study and additional tools to include interactive multimedia reports will help patients understand radiology reports with less noise," Dr. Rasool said.

While there are significant challenges to implementing more concise and structured radiologist reports in practice, Dr. Rasool is optimistic that the new research provides a path forward.

"We believe our iterative approach, starting with continuous sampling report signal-to-noise ratio in the background, will pave the path for other centers to follow in quest of the ideal concise report that is easily digestible by providers and patients alike," he said. "We are now anonymously sampling volunteer radiologists' reports as part of a quality measure and sharing with other university medical centers that also would like to do this sampling."

Access the presentation, "Toward Patient-consumable Radiology Reports—Improving Content Signal-to-Noise Ratio While Converting Medical Jargon to Plain English via GPT-4," (M5B-SPIN) on demand at