
Interactive and immersive image-guided control of interventional manipulators with a prototype holographic interface

Tsiamyrtzis P.
2019-01-01

Abstract

The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user into 3D morphology, is further enhanced with the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance planning and visualization of interventional procedures. To this end, we introduce a computational platform to generate an augmented reality holographic scene that fuses pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures using voice and gestures, as well as to control the robot. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested with a HoloLens HMD in silico. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the functionality and underscored the importance of interface customization to fit a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.
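The forbidden-region virtual fixtures mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes, for simplicity, that a vital structure is approximated by a bounding sphere (real segmented anatomy would use a mesh or voxel mask) and checks whether the robot tip has entered a safety margin around it.

```python
import math

def violates_fixture(tip, center, radius, margin_mm=2.0):
    """Return True if the robot tip lies within `margin_mm` of a
    forbidden spherical region (center, radius), i.e. an alert
    should be raised before the tip reaches the vital structure.

    All coordinates and distances are assumed to be in millimeters.
    """
    dist = math.dist(tip, center)  # Euclidean distance tip -> structure center
    return dist < radius + margin_mm

# Hypothetical example: structure of 10 mm radius at the origin,
# with a 2 mm safety margin around it.
print(violates_fixture((0.0, 0.0, 11.0), (0.0, 0.0, 0.0), 10.0))  # inside the margin
print(violates_fixture((0.0, 0.0, 15.0), (0.0, 0.0, 0.0), 10.0))  # clear of the margin
```

In a full system this check would run on every pose update of the actuated robot model, triggering a visual or auditory alert in the holographic scene when it returns True.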
Proceedings - 2019 IEEE 19th International Conference on Bioinformatics and Bioengineering, BIBE 2019
978-1-7281-4617-1
Augmented reality; Image guided interventions; Robot-assistance


Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1130254
Citations
  • Scopus: 3
  • Web of Science: 2