Hybrid Tracking Module for Real-Time Tool Tracking for an Autonomous Exoscope

E. Iovene; E. De Momi
2024-01-01

Abstract

Exoscopes have emerged as a promising visual solution in the field of microneurosurgery. However, manual repositioning poses a challenge, causing interruptions that disrupt the surgical flow; hence the need for hands-free exoscope control. This letter introduces a position-based visual-servoing control approach comprising a detection module, a hybrid tracking module, and a control module that adjusts a robotic camera holder to follow a surgical tool. The hybrid module tracks and predicts the surgical tool's future position to minimize system latency. The proposed system consists of a 7-Degree-of-Freedom robotic manipulator with an eye-in-hand stereo camera. A comparative analysis against three alternative approaches (Convolutional Neural Network, CNN; Particle Filter, PF; Optical Flow, OF) was performed using Tracking Error and Center Error metrics. Results showed improved tracking accuracy, with an average error of 9.84 ± 0.08 mm for slow movements (2.5 cm/s) and 13.11 ± 0.39 mm for rapid movements (4 cm/s). Finally, a user study was conducted to investigate whether the proposed system effectively reduced the users' workload compared to manual repositioning of the camera.
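The abstract describes a hybrid module that both tracks the tool and predicts its future position to compensate for system latency, but does not specify the estimator used. Below is a minimal sketch of one plausible realization, assuming a constant-velocity Kalman filter over the 3D tool-tip position; the class name, noise parameters, and latency value are illustrative assumptions, not the authors' implementation.

    # Sketch of a predict-ahead tracker (NOT the authors' implementation).
    # Assumes a constant-velocity Kalman filter over the 3D tool-tip position,
    # used to extrapolate the tool's position one latency interval ahead so the
    # robotic camera holder can servo toward where the tool will be.
    import numpy as np

    class ConstantVelocityKalman:
        def __init__(self, dt, process_var=1e-2, meas_var=1e-3):
            # State: [x, y, z, vx, vy, vz]
            self.x = np.zeros(6)
            self.P = np.eye(6)
            self.F = np.eye(6)
            self.F[:3, 3:] = dt * np.eye(3)  # position += velocity * dt
            # We observe position only (e.g. from stereo triangulation)
            self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
            self.Q = process_var * np.eye(6)
            self.R = meas_var * np.eye(3)

        def update(self, z):
            """Fuse a new 3D tool-tip measurement z (metres)."""
            # Predict to the current time step
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # Standard Kalman correction
            y = z - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ self.H) @ self.P

        def predict_ahead(self, latency):
            """Extrapolate the tool position `latency` seconds into the future."""
            pos, vel = self.x[:3], self.x[3:]
            return pos + vel * latency

    # Example usage: fuse measurements at 30 Hz, predict 50 ms ahead
    # (both rates are placeholders, not values from the paper)
    kf = ConstantVelocityKalman(dt=1 / 30)
    kf.update(np.array([0.10, 0.02, 0.15]))   # tool tip in metres
    target = kf.predict_ahead(latency=0.05)   # feed to the visual-servoing loop

Predicting one latency interval ahead lets the camera-holder controller servo toward where the tool will be rather than where it was last observed, which is the latency-compensation idea the abstract describes.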
Files in this product:

  • File: Hybrid_Tracking_Module_for_Real-Time_Tool_Tracking_for_an_Autonomous_Exoscope.pdf
  • Access: open access
  • Type: Pre-Print (or Pre-Refereeing)
  • Size: 1.54 MB
  • Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1265704
Citations
  • PMC: ND
  • Scopus: 4
  • Web of Science (ISI): 4