Ultrasound Helmet Designed for Live Images and Brain-Machine Interface
By MedImaging International staff writers
Posted on 21 May 2018
A team of researchers from Vanderbilt University (Nashville, TN, USA) plans to create a brain-machine interface using an ultrasound helmet and electroencephalogram (EEG) that would allow doctors to view images of the brain as clear as ultrasound images of the heart or womb. By using ultrasound technology for the brain, doctors could not only view real-time images during surgery, but also gain a better understanding of which areas become stimulated by certain feelings or actions. The ultrasound helmet would ultimately provide an effective way for people to control software and robotics by thinking about it.
Ultrasound beams bouncing around inside the skull make it practically impossible to view any useful imagery. Brett Byram, assistant professor of biomedical engineering at Vanderbilt University, plans to use machine learning that will gradually be able to account for the distortion and deliver workable images. Byram also plans to integrate EEG technology to allow doctors to view brain perfusion—how blood flow correlates to changes in thought—as well as the areas of stimulation related to movement and emotion. Byram will use his new USD 550,000 Faculty Early Career Development (CAREER) grant from the National Science Foundation to develop the helmet, working alongside Leon Bellan, assistant professor of mechanical engineering and biomedical engineering, and Michael Miga, Harvie Branscomb Professor and professor of biomedical engineering, radiology and neurological surgery.
The researchers believe that the applications of such an ultrasound helmet could be endless. For instance, a person with limited movement due to ALS could simply think about wanting a glass of water, prompting a robotic arm to retrieve one once the helmet detected the blood flow and EEG signals encoding that intent.
“The goal is to create a brain-machine interface using an ultrasound helmet and EEG,” said Byram. “A lot of the technology we’re using now wasn’t available when people were working on this 20 or 30 years ago. Deep neural networks and machine learning have become popular, and our group is the first to show how you can use those for ultrasound beamforming.”
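To illustrate the beamforming step that Byram's group proposes to improve with deep neural networks, the sketch below shows conventional delay-and-sum ultrasound beamforming on synthetic channel data. This is a hypothetical, simplified illustration, not the group's actual method: all array geometry, sampling values, and function names are assumptions chosen for clarity, and the learned network that would replace the plain summation is omitted.

```python
import numpy as np

# Minimal, illustrative delay-and-sum beamformer. In a learned beamformer,
# the final coherent summation would be replaced or augmented by a trained
# neural network; here we show only the conventional baseline.

C = 1540.0          # assumed speed of sound in soft tissue, m/s
FS = 40e6           # assumed sampling rate, Hz
N_ELEMENTS = 16     # illustrative transducer element count
PITCH = 0.3e-3      # illustrative element spacing, m

# Element x-positions, centered on the array
elem_x = (np.arange(N_ELEMENTS) - (N_ELEMENTS - 1) / 2) * PITCH

def delay_and_sum(channel_data, focus_x, focus_z):
    """Sum channel signals after compensating each element's
    round-trip delay to the focal point (focus_x, focus_z)."""
    dist = np.sqrt((elem_x - focus_x) ** 2 + focus_z ** 2)  # receive path
    delays = (focus_z + dist) / C                # transmit + receive, seconds
    samples = np.round(delays * FS).astype(int)  # nearest sample index
    samples = np.clip(samples, 0, channel_data.shape[1] - 1)
    return sum(channel_data[i, samples[i]] for i in range(N_ELEMENTS))

# Synthetic channel data: a point scatterer at (0, 20 mm) produces a unit
# echo on each channel at exactly its round-trip delay.
scat_x, scat_z = 0.0, 20e-3
n_samp = 4096
channel_data = np.zeros((N_ELEMENTS, n_samp))
dist = np.sqrt((elem_x - scat_x) ** 2 + scat_z ** 2)
echo_idx = np.round((scat_z + dist) / C * FS).astype(int)
channel_data[np.arange(N_ELEMENTS), echo_idx] = 1.0

on_target = delay_and_sum(channel_data, scat_x, scat_z)  # echoes align coherently
off_target = delay_and_sum(channel_data, 5e-3, 25e-3)    # delays misaligned
print(on_target, off_target)
```

Focusing on the true scatterer location sums all channels coherently, while a misaligned focus picks mostly empty samples; skull-induced aberration corrupts exactly these delay assumptions, which is the distortion a learned beamformer would have to absorb.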
Related Links:
Vanderbilt University