Learning-Based Hybrid Needle Tip Tracking and Visualization Framework for Robotic Ultrasound-Guided Interventions
Junling Fu;Zijian Cai;Runing Xiao;Giancarlo Ferrigno;Alberto Redaelli;Elena De Momi
2026-01-01
Abstract
Ultrasound (US)-guided needle-based interventions are widely employed in clinical practice for biopsies, regional anesthesia, and tumor ablation, owing to the advantages of real-time imaging, portability, and the absence of ionizing radiation. Precise localization and tracking of the needle tip are critical for ensuring successful needle placement while minimizing the risk of injury to surrounding anatomical structures. However, achieving real-time and reliable needle tip tracking remains challenging due to the inherent properties of US images, such as low spatial resolution, speckle noise, and imaging artifacts that obscure the needle tip. This study proposes a hybrid learning-based framework for accurately and robustly localizing the needle tip within the US image plane during robot-assisted interventional procedures. The framework comprises a dual-branch template matching-based detection module, a Convolutional Neural Network (CNN)-based detection module, and a “dynamic template selection and updating” mechanism to enhance overall tracking performance. Additionally, to recover needle tip visibility when it is lost due to unanticipated movements, an active exploration strategy based on a Gaussian Process (GP) model and Bayesian Optimization (BO) is proposed for efficient and accurate adjustment of the robotic US probe pose. The needle tip tracking performance of the proposed method was evaluated through a series of experiments conducted on ex vivo porcine liver samples under varying insertion angles and velocities. Experimental results demonstrate that the proposed framework achieves a maximum median localization error of 1.17 mm and an Interquartile Range (IQR) of 1.28 mm across all test conditions. Furthermore, in scenarios requiring needle tip visibility recovery, the proposed BO-based strategy achieved a median localization error of 1.52 mm and an IQR of 0.95 mm.

| File | Size | Format |
|---|---|---|
| IEEE TMRB, Learning-Based Needle Tip Tracking for Robotic US-guided Intervention_5M.pdf (open access, pre-print / pre-refereeing) | 4.51 MB | Adobe PDF |
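The GP/BO-based active exploration described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a simplified 2-D probe-pose search space and a hypothetical `visibility_score` objective standing in for the image-based needle-tip confidence, and it uses an upper-confidence-bound acquisition rule with a hand-rolled GP for self-containedness.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=5.0, variance=1.0):
    """Squared-exponential covariance between pose sets A and B (rows are poses)."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * np.maximum(d2, 0.0) / length_scale**2)

def gp_posterior(X_query, X_obs, y_obs, noise=1e-4):
    """GP posterior mean and variance at X_query given observations (X_obs, y_obs)."""
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    K_q = rbf_kernel(X_query, X_obs)
    mu = K_q @ alpha
    v = np.linalg.solve(L, K_q.T)
    var = rbf_kernel(X_query, X_query).diagonal() - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 1e-12)

def visibility_score(pose):
    """Hypothetical stand-in for an image-based needle-tip visibility score.
    In a real system this would be a tracking-confidence value computed from
    the live US frame; here a smooth peak at (2.0, -1.0) is used instead."""
    return -((pose[0] - 2.0)**2 + (pose[1] + 1.0)**2)

rng = np.random.default_rng(0)
low, high = np.array([-5.0, -5.0]), np.array([5.0, 5.0])  # probe-pose offset bounds

# Seed the GP with a few randomly sampled probe poses.
X = rng.uniform(low, high, size=(3, 2))
y = np.array([visibility_score(x) for x in X])

# BO loop: fit the GP, then pick the next probe pose to try by maximizing
# an upper confidence bound (mean + 2*std) over random candidate poses.
for _ in range(15):
    candidates = rng.uniform(low, high, size=(256, 2))
    mu, var = gp_posterior(candidates, X, y)
    ucb = mu + 2.0 * np.sqrt(var)
    x_next = candidates[np.argmax(ucb)]
    X = np.vstack([X, x_next])
    y = np.append(y, visibility_score(x_next))

best_pose = X[np.argmax(y)]  # pose at which the needle tip was most visible
```

The UCB acquisition trades off exploring poses where the GP is uncertain against exploiting poses already predicted to yield high visibility, which is the core idea behind sample-efficient probe repositioning when the needle tip leaves the image plane.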
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


