Real‐time radial reconstruction with domain transform manifold learning for MRI‐guided radiotherapy

Paganelli, Chiara;
2023-01-01

Abstract

Background: MRI-guidance techniques that dynamically adapt radiation beams to follow tumor motion in real time promise more accurate cancer treatments and reduced collateral damage to healthy tissue. The gold standard for reconstructing undersampled MR data is compressed sensing (CS), which is computationally slow and limits the rate at which images become available for real-time adaptation.

Purpose: Once trained, neural networks can accurately reconstruct raw MRI data with minimal latency. Here, we test the suitability of deep-learning-based image reconstruction for real-time tracking applications on MRI-Linacs.

Methods: We use automated transform by manifold approximation (AUTOMAP), a generalized framework that maps raw MR signal to the target image domain, to rapidly reconstruct images from undersampled radial k-space data. The AUTOMAP neural network was trained to reconstruct images from a golden-angle radial acquisition, a benchmark for motion-sensitive imaging, on lung cancer patient data and on generic images from ImageNet. Model training was subsequently augmented with motion-encoded k-space data derived from videos in the YouTube-8M dataset to encourage motion-robust reconstruction.

Results: AUTOMAP models fine-tuned on retrospectively acquired lung cancer patient data reconstructed radial k-space with accuracy equivalent to CS but with much shorter processing times. Validation of the motion-trained models on a virtual dynamic lung tumor phantom showed that the generalized motion properties learned from YouTube improve target tracking accuracy.

Conclusion: AUTOMAP achieves real-time, accurate reconstruction of radial data. These findings suggest that neural-network-based reconstruction is potentially superior to alternative approaches for real-time image guidance applications.
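The golden-angle radial acquisition named in the Methods advances successive k-space spokes by approximately 111.246° (180° divided by the golden ratio), so any contiguous window of spokes covers k-space nearly uniformly. As an illustration only, not code from the paper, a minimal NumPy sketch of such a trajectory:

```python
import numpy as np

# Golden angle for radial MRI: 180 degrees divided by the golden ratio,
# giving near-uniform k-space coverage for any number of spokes.
GOLDEN_ANGLE = np.pi / ((1 + np.sqrt(5)) / 2)  # ~1.9416 rad (~111.246 deg)

def golden_angle_trajectory(n_spokes, n_samples):
    """k-space coordinates for a 2D golden-angle radial acquisition.

    Returns an (n_spokes, n_samples, 2) array of (kx, ky) points in
    normalised units, with every spoke passing through the k-space centre.
    """
    angles = np.arange(n_spokes) * GOLDEN_ANGLE
    # Radial coordinate spans the normalised k-space extent [-0.5, 0.5).
    radii = np.linspace(-0.5, 0.5, n_samples, endpoint=False)
    kx = radii[None, :] * np.cos(angles)[:, None]
    ky = radii[None, :] * np.sin(angles)[:, None]
    return np.stack([kx, ky], axis=-1)

traj = golden_angle_trajectory(n_spokes=64, n_samples=256)
print(traj.shape)  # (64, 256, 2)
```

With 64 spokes of a 256-point readout, only a fraction of a Cartesian grid is sampled, which is the undersampled regime the reconstruction network must handle.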
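AUTOMAP, as introduced by Zhu et al., learns the sensor-to-image domain transform with fully connected layers followed by a convolutional stage. Below is a minimal PyTorch sketch of that general structure; the layer sizes, activations, and input formatting are illustrative assumptions, not the configuration trained in this work:

```python
import torch
import torch.nn as nn

class AutomapLikeNet(nn.Module):
    """Minimal sketch of an AUTOMAP-style domain-transform network.

    Fully connected layers learn the mapping from raw (complex) k-space
    samples to the image manifold; a convolutional stage then refines the
    result. Dimensions here are illustrative, not the published model.
    """
    def __init__(self, n_kspace, img_size=128):
        super().__init__()
        self.img_size = img_size
        # Real and imaginary k-space parts are flattened and concatenated.
        self.fc = nn.Sequential(
            nn.Linear(2 * n_kspace, img_size * img_size), nn.Tanh(),
            nn.Linear(img_size * img_size, img_size * img_size), nn.Tanh(),
        )
        self.conv = nn.Sequential(
            nn.Conv2d(1, 64, 5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 64, 5, padding=2), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 7, padding=3),
        )

    def forward(self, k_real_imag):            # (batch, 2 * n_kspace)
        x = self.fc(k_real_imag)               # learned domain transform
        x = x.view(-1, 1, self.img_size, self.img_size)
        return self.conv(x)                    # refined image estimate
```

Because inference is a fixed sequence of matrix multiplications and convolutions, reconstruction latency is essentially constant, in contrast to the iterative optimisation CS requires.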
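The motion-encoded training data described in the Methods can be pictured as radial k-space in which each spoke is sampled from a different video frame, so object motion is baked into the acquisition. A toy version follows, assuming one frame per spoke and nearest-neighbour lookup on the Cartesian FFT grid in place of a proper NUFFT; the paper's actual simulation pipeline may differ:

```python
import numpy as np

def motion_encoded_kspace(frames, traj):
    """Toy motion-encoded radial k-space: spoke n is sampled from frame n.

    frames : (n_spokes, N, N) grayscale video frames
    traj   : (n_spokes, n_samples, 2) normalised (kx, ky) coordinates,
             e.g. from golden_angle_trajectory() above
    """
    n_spokes, N, _ = frames.shape
    kspace = np.empty(traj.shape[:2], dtype=complex)
    for n in range(n_spokes):
        # Each spoke sees the scene at a different time point.
        F = np.fft.fftshift(np.fft.fft2(frames[n]))
        ix = np.clip(np.round((traj[n, :, 0] + 0.5) * N).astype(int), 0, N - 1)
        iy = np.clip(np.round((traj[n, :, 1] + 0.5) * N).astype(int), 0, N - 1)
        kspace[n] = F[iy, ix]
    return kspace
```

Training on such data exposes the network to the inter-spoke inconsistencies that real tumor motion produces, which is the property the phantom validation probes.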
Keywords: MRI; deep learning; radiotherapy


Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1259648
Citations
  • PMC: 0
  • Scopus: 1
  • Web of Science: ND