AI Algorithm Reads Ultrasound Images from Hand-Held Devices and Smartphone

By MedImaging International staff writers
Posted on 31 Mar 2022

Ultrasound technology is becoming more portable and more affordable. However, up to half of all birthing parents in developing countries are not screened during pregnancy because existing hand-held devices require a trained technician to precisely manipulate the ultrasound probe to capture the right images. In addition, the images must be interpreted by a radiologist or specially trained obstetrician, both of whom are in short supply in many underserved communities and developing countries. That is where artificial intelligence (AI) comes in.

Northwestern University (Evanston, IL, USA) and Google Health (Menlo Park, CA, USA) are collaborating on a project to bring fetal ultrasound to developing countries by combining AI, low-cost hand-held ultrasound devices and a smartphone. The project will develop algorithms enabling AI to read ultrasound images captured on these devices by lightly trained community health workers, and even by pregnant people at home, with the aim of assessing the wellness of both the birthing parent and baby. The low-cost device will capture raw ultrasound images and send them to a smartphone, where AI developed by Google Health will interpret critical features such as fetal age and position.

Image: Researchers are teaching AI to read fetal ultrasound to identify high-risk patients (Photo courtesy of Northwestern University)

In the first step toward developing the algorithms, the researchers will enroll pregnant patients who will perform their own ultrasounds with a low-cost hand-held device. Northwestern technicians will also perform fetal ultrasounds on the patients, and even family members will participate. The patients will then undergo a regular clinical fetal ultrasound. All the images and other pregnancy-related data will be downloaded into a database.

Study participants will use hand-held ultrasound devices pre-installed with Google Health’s custom application to collect, process and deliver the fetal ultrasound “blind sweeps.” “Blind-sweep” ultrasounds consist of six freehand ultrasound sweeps across the abdomen to generate a computer image. The goal is to collect a broad set of data and related information, including reports on fetal-growth restriction, placental location, gestational age and other relevant conditions and risk factors. Data will be gathered across all three trimesters and from a diverse, representative group of patients. The study will collect ultrasound images from several thousand patients over the next year. The AI will receive professional and amateur images covering the many conditions that physicians typically want to monitor, such as the age of the fetus and whether it has a heart defect. By having these side-by-side image captures, the AI can learn to interpret the amateur images more accurately.

“We want to make high-quality fetal ultrasound as easy as taking your temperature,” said Dr. Mozziyar Etemadi, assistant professor of anesthesiology at Northwestern University Feinberg School of Medicine and leader of the project at Northwestern. “The real power of this AI tool will be to allow for earlier triaging of care, so a lightly trained community health provider can conduct scans of birthing parents. The patients don’t have to go to the city to get it. The AI will help inform what to do next – if the patient is OK or they need to go to a higher level of care. We really believe this will save the lives of a lot of birthing parents and babies.”

Related Links:
Northwestern University 
Google Health 