Researchers Use AI to Improve Mammogram Interpretation

By MedImaging International staff writers
Posted on 04 Jul 2018
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory (Oak Ridge, TN, USA) successfully used artificial intelligence to improve understanding of the cognitive processes involved in image interpretation. Their work, which was published in the Journal of Medical Imaging, will help reduce errors in the analyses of diagnostic images by health professionals and has the potential to improve health outcomes for women affected by breast cancer.

Early detection of breast cancer, which is critical for effective treatment, requires accurate interpretation of a patient’s mammogram. The ORNL-led team of researchers found that radiologists’ analyses of mammograms were significantly influenced by context bias, or the influence of the radiologist’s previous diagnostic experiences. New radiology trainees were the most susceptible to the phenomenon, although even more experienced radiologists fell victim to it to some degree, according to the researchers.

Image: Researchers used AI to improve mammogram image interpretation (Photo courtesy of the Department of Energy’s Oak Ridge National Laboratory).

The researchers designed an experiment that tracked the eye movements of radiologists at various skill levels to better understand the context bias involved in their individual interpretations of the images. The experiment followed the eye movements of three board-certified radiologists and seven radiology residents as they analyzed 100 mammographic studies from the University of South Florida’s Digital Database for Screening Mammography. The 400 images (four views per study), representing a mix of cancer, no cancer, and cases that mimicked cancer but were benign, were specifically selected to cover a range of cases similar to that found in a clinical setting.

The participants, who were grouped by level of experience and had no prior knowledge of what the individual X-rays contained, were outfitted with a head-mounted eye-tracking device designed to record their “raw gaze data,” which characterized their overall visual behavior. The study also recorded the participants’ diagnostic decisions, noting the location of suspicious findings along with their characteristics according to the BI-RADS lexicon, the standardized reporting scheme for mammograms. By computing a measure known as a fractal dimension on each participant’s scan path (the map of their eye movements) and performing a series of statistical calculations, the researchers were able to discern how the participants’ eye movements differed from mammogram to mammogram. They also calculated how this deviation varied across image categories, such as images that showed cancer and those that were easier or harder to decipher.
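As an illustration only, the following Python/NumPy sketch shows how a box-counting fractal dimension could be estimated for a scan path of gaze coordinates; the array layout, normalization to a unit square, and grid resolutions are assumptions for demonstration, not the authors’ actual code.

import numpy as np

def fractal_dimension(scan_path, box_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting (fractal) dimension of a scan path.

    scan_path : (N, 2) array of gaze x/y coordinates, scaled to [0, 1].
    box_sizes : grid resolutions at which occupied boxes are counted.
    """
    pts = np.clip(np.asarray(scan_path, dtype=float), 0.0, 1.0 - 1e-9)
    counts = []
    for n in box_sizes:
        # Assign every gaze sample to a cell of an n-by-n grid and count
        # how many distinct cells the path visits at this resolution.
        cells = np.floor(pts * n).astype(int)
        counts.append(len({tuple(c) for c in cells}))
    # The slope of log(count) versus log(resolution) approximates the dimension.
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return slope

# Example with a synthetic scan path of 500 gaze samples (illustrative only).
rng = np.random.default_rng(0)
path = rng.random((500, 2))
print(f"estimated fractal dimension: {fractal_dimension(path):.2f}")

A more space-filling scan path occupies more grid cells at fine resolutions and so yields a higher estimated dimension, which is one way such a measure can summarize how thoroughly a reader visually searches an image.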

To track the participants’ eye movements effectively, the researchers had to rely on real-time sensor data, which logs nearly every movement of the eyes. With 10 observers interpreting 100 cases, however, the volume of data quickly became impractical to analyze manually, so the researchers turned to artificial intelligence to make sense of the results efficiently. Using ORNL’s Titan supercomputer, they were able to rapidly train the deep learning models required to interpret the large datasets. Whereas similar studies in the past have used aggregation methods to condense such enormous data sets, the ORNL team processed the full data sequence, a critical step because, over time, the sequence revealed how the participants’ eye paths diverged as they analyzed the various mammograms.
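To illustrate the distinction between aggregation and full-sequence processing, the hypothetical snippet below contrasts collapsing a gaze recording into summary statistics with keeping the ordered samples for a sequence model; the (timestamp, x, y) record format and padding length are assumptions, not details from the study.

import numpy as np

def aggregate_features(gaze):
    """Collapse a gaze recording into a few summary statistics.
    Temporal order is lost: two very different scan paths can
    produce identical summaries."""
    xy = gaze[:, 1:]
    return np.concatenate([xy.mean(axis=0), xy.std(axis=0), [len(gaze)]])

def full_sequence(gaze, max_len=5000):
    """Keep the ordered (x, y) samples, padded or truncated to a fixed
    length, so a sequence model can learn how the eye path unfolds."""
    xy = gaze[:max_len, 1:]
    padded = np.zeros((max_len, 2))
    padded[:len(xy)] = xy
    return padded

# Illustrative usage with 300 synthetic (timestamp, x, y) records.
rec = np.column_stack([np.arange(300), np.random.rand(300), np.random.rand(300)])
print(aggregate_features(rec).shape)  # (5,)     -> order of fixations discarded
print(full_sequence(rec).shape)       # (5000, 2) -> order preserved for a model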

In a related paper published in the Journal of Human Performance in Extreme Environments, the researchers demonstrated that convolutional neural networks, a type of artificial intelligence commonly applied to image analysis, significantly outperformed other methods, such as deep neural networks and deep belief networks, at parsing the eye-tracking data, and by extension validated the experiment as a means of measuring context bias. Furthermore, while the experiment focused on radiology, the resulting data drove home the need for “intelligent interfaces and decision support systems” to assist human performance across a range of complex tasks, including air-traffic control and battlefield management.
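As a rough sketch of the kind of convolutional approach described, the example below defines a small one-dimensional convolutional network in PyTorch (the article does not specify the team’s framework) that classifies fixed-length gaze sequences; the architecture, sequence length, and two-class labels are illustrative assumptions rather than the published model.

import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    def __init__(self, seq_len=5000, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            # Input: (batch, 2 channels = x/y coordinates, seq_len gaze samples)
            nn.Conv1d(2, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per channel
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        # x: (batch, 2, seq_len) ordered gaze coordinates
        h = self.features(x).squeeze(-1)  # (batch, 32)
        return self.classifier(h)

model = GazeCNN()
dummy = torch.randn(8, 2, 5000)   # 8 synthetic gaze sequences
print(model(dummy).shape)         # torch.Size([8, 2])

Because the convolutions slide over the time axis, such a network can pick up local patterns in how gaze moves across an image, which is the kind of temporal structure that simple aggregated features discard.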

While machines are unlikely to replace radiologists (or other humans involved in rapid, high-impact decision-making) any time soon, they do hold enormous potential to assist health professionals and other decision makers in reducing errors due to phenomena such as context bias, according to Gina Tourassi, team lead and director of ORNL’s Health Data Science Institute. “These findings will be critical in the future training of medical professionals to reduce errors in the interpretations of diagnostic imaging. These studies will inform human/computer interactions, going forward as we use artificial intelligence to augment and improve human performance,” said Tourassi.

Related Links:
Oak Ridge National Laboratory

