AI Meets MRI

Image courtesy of Noemi Jester.

A cancer patient lies patiently in the narrow tube of an MRI machine, the room humming with magnetic pulses. For many patients, the scariest part is not the scan itself, but the long wait that follows. Has the treatment been working? Is the tumor getting smaller? Somewhere down the hospital corridor, a radiologist sits in a dim room, eyes locked onto a glowing monitor, meticulously tracing the outline of a tumor on dozens—sometimes hundreds—of MRI slices.

“It can take up to four hours,” said Noemi Jester, postgraduate research fellow and lead author on a study out of the Yale Department of Orthopaedics and Rehabilitation 3D Tumor Lab.

The process, known as manual segmentation, is the standard for accurately measuring tumor volume from an MRI. It involves outlining the tumor on each MRI slice, then summing those cross-sectional areas (scaled by slice thickness) to calculate the tumor’s volume. But it’s time-intensive and challenging to sustain for widespread clinical use. Previous efforts to speed up the process have used linear measurements, estimating tumor shape as an ellipsoid based on the longest diameter, but these estimates tend to be highly inaccurate. That inaccuracy is particularly problematic for a type of tumor called vestibular schwannoma, a typically benign growth on the vestibulocochlear nerve connecting the inner ear to the brain. These tumors look quite different from the roughly spherical growths one might picture: they are shaped like ice-cream cones, with a wide base and a narrow tip, leading linear approximations to grossly overestimate their size.
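For the technically curious, the gap between the two approaches can be sketched with a toy calculation. The numbers below are illustrative placeholders, not measurements from the study; they simply show how summing traced slice areas compares with the crudest linear estimate, a sphere built from the longest diameter.

```python
import numpy as np

# --- Manual (slice-by-slice) volume ----------------------------------------
# Hypothetical cone-shaped tumor: cross-sectional areas (mm^2) traced on
# successive MRI slices, tapering from a wide base to a narrow tip.
slice_areas_mm2 = np.array([120, 110, 95, 75, 50, 30, 15, 5], dtype=float)
slice_thickness_mm = 3.0

# Volume = sum of traced areas, each scaled by the slice thickness.
manual_volume_mm3 = slice_areas_mm2.sum() * slice_thickness_mm

# --- Linear approximation from the longest diameter -------------------------
# The longest dimension here runs along the stack of slices (8 slices x 3 mm).
longest_diameter_mm = 24.0
linear_volume_mm3 = (np.pi / 6.0) * longest_diameter_mm ** 3

print(f"slice-summed volume: {manual_volume_mm3:7.0f} mm^3")   # ~1500 mm^3
print(f"linear estimate:     {linear_volume_mm3:7.0f} mm^3")   # ~7238 mm^3
```

For a tapered shape like this, the linear estimate comes out several times larger than the slice-by-slice volume, which is the overestimation the researchers describe.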

When growth occurs in the narrow portions of the tumor, it is often underestimated or missed entirely. In other words, real changes in the tumor’s shape or size can be masked by the noise of an imprecise approximation. Physicians rely heavily on tumor volume to determine the tumor’s growth rate, so inaccurate or delayed volumetric analysis creates a pressing clinical challenge. For patients, the absence of accurate volume data can undermine both their confidence in treatment and their understanding of their condition.

To move away from linear measurements and to make the precision of manual volumetric analysis more accessible, the researchers at the 3D Tumor Lab have turned to artificial intelligence. Working with radiologists and computer scientists, Jester and her team trained a neural network to segment vestibular schwannomas automatically, comparing its output with manual segmentations.

“Vestibular schwannomas are a very characteristic type of brain tumor, growing along the same nerve within the auditory canal,” Jester said. To train the neural network, Jester and her team leveraged this specificity. The network uses pattern recognition to identify the general location of the tumor. MRI images are made up of varying signal intensities that reflect fat and fluid content, and tumors have a distinct intensity compared to the surrounding tissue. By detecting these intensity differences, the neural network can separate the tumor from the non-tumor areas in each MRI slice. The training process involved optimizing the network using over a hundred MRI scans.
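The article does not describe the lab’s architecture or training setup, but the general recipe, learning to label each pixel of a slice as tumor or non-tumor from manually segmented examples, looks roughly like the sketch below. The tiny network, the random tensors standing in for scans and masks, and the hyperparameters are all placeholders for illustration.

```python
import torch
import torch.nn as nn

# Tiny stand-in for a segmentation network (the study's actual architecture
# is not described): a few conv layers mapping an MRI slice to a per-pixel
# tumor / non-tumor score.
class ToySegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # one logit per pixel
        )

    def forward(self, x):
        return self.net(x)

model = ToySegNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # pixel-wise tumor vs. non-tumor

# Dummy batch standing in for training data: MRI slices paired with the
# manually segmented masks used as ground truth.
slices = torch.randn(8, 1, 128, 128)                  # signal intensities
masks = (torch.rand(8, 1, 128, 128) > 0.9).float()    # 1 = tumor pixel

for step in range(20):                                 # toy training loop
    optimizer.zero_grad()
    logits = model(slices)
    loss = loss_fn(logits, masks)   # penalize disagreement with the manual mask
    loss.backward()
    optimizer.step()
```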

The result? A remarkable match. There was a high level of similarity between the AI’s measurements and those done by hand—and each measurement took only about two minutes, over one hundred times faster than the current process.
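The article does not name the agreement metric, but the standard way to quantify overlap between an automatic and a manual segmentation is the Dice coefficient, sketched here for two binary masks.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, manual_mask: np.ndarray) -> float:
    """Overlap between two binary masks: 1.0 = identical, 0.0 = no overlap."""
    pred = pred_mask.astype(bool)
    manual = manual_mask.astype(bool)
    intersection = np.logical_and(pred, manual).sum()
    total = pred.sum() + manual.sum()
    return 2.0 * intersection / total if total > 0 else 1.0
```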

Automating the manual volumetric analysis process doesn’t just save time—it reshapes the doctor-patient experience. “Right now, radiologists spend more time segmenting than analyzing,” Jester said. With AI handling the segmentation, radiologists and the rest of the patient’s physician team can focus on the bigger picture: how the tumor is behaving, whether treatment is working, and how best to plan the next steps.

Even more exciting, the AI generates 3D images of the tumor following segmentation, which can help patients visualize their tumor—something formerly buried in complex radiology reports. 

“The model creates a beautiful, intuitive rendering,” Jester said. “It helps patients understand where the tumor is and how it’s changing over time.” By observing the tumor structures in 3D, patients may gain a sense of empowerment and control, seeing more clearly what they are fighting against.
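The lab’s rendering pipeline is not detailed in the article, but one common way to turn per-slice masks into a 3D model is to stack them into a volume and extract a surface with marching cubes, for example via scikit-image. The volume and spacing values below are placeholders.

```python
import numpy as np
from skimage import measure

# Stack per-slice binary tumor masks into a 3D volume; a dummy block stands
# in here for real segmentation output.
mask_volume = np.zeros((40, 128, 128), dtype=np.uint8)
mask_volume[10:30, 40:80, 40:80] = 1  # placeholder "tumor" region

# Extract a triangular surface mesh of the tumor boundary.
# spacing = (slice thickness, pixel height, pixel width) in mm.
verts, faces, normals, values = measure.marching_cubes(
    mask_volume, level=0.5, spacing=(3.0, 0.5, 0.5)
)

# verts/faces can be exported to a mesh format (STL, OBJ) and rotated in any
# 3D viewer, giving the patient a tangible picture of the tumor's shape.
print(f"mesh with {len(verts)} vertices and {len(faces)} triangles")
```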

Now, imagine a patient walks out of the MRI room, and instead of waiting in uncertainty for forty-eight hours for an incomprehensible report and a week for an appointment with a physician to explain the situation, they are led directly into a physician’s office, where a 3D model of the tumor is already on the screen. With this innovation, the days of anticipation could soon be replaced by informed, empowered choices, creating a more efficient, patient-centered healthcare system.