Augmented Reality Enhances Surgical Navigation Technology
By MedImaging International staff writers | Posted on 25 Jan 2017
Image: Augmented-reality surgical navigation technology enhances X-ray guidance (Photo courtesy of Philips Healthcare).
A combination of three-dimensional (3D) X-ray and advanced optics provides surgeons with augmented-reality imaging during spine, cranial, and trauma surgical procedures.
The Royal Philips augmented-reality surgical navigation technology is designed to help surgeons perform image-guided open and minimally invasive spine surgery in a hybrid operating room (OR). Due to the inherently reduced visibility of the spine and other structures during such procedures, surgeons rely on real-time imaging and navigation solutions in order to guide their surgical tools and implants. The same is true for minimally invasive cranial surgery and surgery on complex trauma fractures.
The new Philips technology extends the company’s low-dose X-ray system with high-resolution optical cameras mounted on the X-ray flat panel detector (FPD) to image the surface of the patient. It then combines the external view captured by the cameras with the internal 3D view acquired by the X-ray system to construct an augmented-reality view of the patient’s external and internal anatomy, improving procedural planning, surgical tool navigation, and implant accuracy, as well as reducing procedure times.
The first pre-clinical study of the technology was undertaken as a collaboration between Philips, Karolinska University Hospital, Cincinnati Children's Hospital Medical Center, and eight other medical centers. The results showed significantly higher overall pedicle screw placement accuracy with the technology (85%) than without it (64%). The study was published in the November 2016 issue of Spine.
“This new technology allows us to intraoperatively make a high-resolution 3D image of the patient’s spine, plan the optimal device path, and subsequently place pedicle screws using the system’s fully-automatic augmented-reality navigation,” said study co-author Stefan Skúlason, MD, of Landspitali University Hospital (Reykjavik, Iceland). “We can also check the overall result in 3D in the OR without the need to move the patient to a CT scanner. And all this can be done without any radiation exposure to the surgeon and with minimal dose to the patient.”