
“Deep-Onto” network for surgical workflow and context recognition

Nakawala, Hirenkumar; Ferrigno, Giancarlo; De Momi, Elena
2018-01-01

Abstract

Purpose: Surgical workflow recognition and context-aware systems could support better decision making and surgical planning by providing focused information, which may eventually enhance surgical outcomes. While current developments in computer-assisted surgical systems mostly focus on recognizing surgical phases, they lack recognition of the surgical workflow sequence and of other contextual elements, e.g., “Instruments.” Our study proposes a hybrid approach, i.e., combining deep learning and knowledge representation, to facilitate recognition of the surgical workflow. Methods: We implemented the “Deep-Onto” network, an ensemble of deep learning models and knowledge management tools, namely an ontology and production rules. As a prototypical scenario, we chose robot-assisted partial nephrectomy (RAPN). We annotated RAPN videos with surgical entities, e.g., “Step.” We performed several experiments, including an inter-subject variability analysis, to recognize surgical steps. The corresponding subsequent steps, along with other surgical contexts, i.e., “Actions,” “Phase” and “Instruments,” were also recognized. Results: The system recognized 10 RAPN steps with a prevalence-weighted macro-average (PWMA) recall of 0.83, PWMA precision of 0.74, PWMA F1 score of 0.76, and an accuracy of 74.29% on 9 RAPN videos. Conclusion: We found that the combined use of deep learning and knowledge representation techniques is a promising approach for the multi-level recognition of the RAPN surgical workflow.
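The reported metrics map onto standard weighted-average classification scores. Below is a minimal sketch, assuming that prevalence-weighted macro-averaging is equivalent to scikit-learn's support-weighted averaging (`average="weighted"`); the label arrays are invented placeholders, not data from the study.

```python
# Hedged sketch: computing PWMA recall/precision/F1 and accuracy for
# step recognition, with surgical steps encoded as integer class labels.
# y_true / y_pred are invented per-frame labels, not the paper's data.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [0, 0, 1, 1, 2, 2, 3, 3]  # ground-truth steps (the paper uses 10)
y_pred = [0, 1, 1, 1, 2, 2, 3, 0]  # model predictions

print("PWMA recall:   ", recall_score(y_true, y_pred, average="weighted"))
print("PWMA precision:", precision_score(y_true, y_pred, average="weighted"))
print("PWMA F1 score: ", f1_score(y_true, y_pred, average="weighted"))
print("Accuracy:      ", accuracy_score(y_true, y_pred))
```

The knowledge-driven half of the pipeline can likewise be pictured as production rules that attach contextual entities (“Phase,” “Actions,” “Instruments”) and the expected subsequent step to a recognized step. The sketch below uses a hypothetical rule table and step names for illustration only; it is not the paper's ontology or rule set.

```python
# Illustrative production-rule lookup: map a recognized step to its
# surgical context. Step names and contents are invented examples.
RULES = {
    "hilum_dissection": {
        "phase": "preparation",
        "actions": ["dissect"],
        "instruments": ["monopolar scissors"],
        "next_step": "hilum_clamping",
    },
}

def infer_context(step: str) -> dict:
    """Return the contextual entities associated with a recognized step."""
    return RULES.get(step, {})

print(infer_context("hilum_dissection"))
```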
Deep learning; Knowledge representation; Robot-assisted partial nephrectomy; Surgical workflow; Surgery; Biomedical Engineering; Radiology, Nuclear Medicine and Imaging; Health Informatics; Computer Science Applications; Computer Vision and Pattern Recognition; Computer Graphics and Computer-Aided Design
Files in this record:

File: “Deep-Onto” network for surgical workflow and context recognition (1).pdf
Description: Publisher's version
Size: 2.42 MB
Format: Adobe PDF
Access: Restricted

File: deep-network-surgical (1).pdf
Description: Post-Print (DRAFT or Author's Accepted Manuscript, AAM)
Size: 3.45 MB
Format: Adobe PDF
Access: Open

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1071405
Citations
  • PMC: 9
  • Scopus: 50
  • Web of Science: 39