
Computer-aided cephalometric landmark annotation for CBCT data

Codari, Marina; Caffini, Matteo; Baselli, Giuseppe
2017-01-01

Abstract

Purpose: With the increasing diffusion of Cone Beam Computed Tomography (CBCT) scanners in dental and maxillofacial practice, 3D cephalometric analysis is emerging. Maxillofacial surgeons and dentists make wide use of cephalometric analysis in diagnosis, surgery, and treatment planning. The accuracy and repeatability of the manual approach, the most common in clinical practice, are limited by intra- and inter-subject variability in landmark identification. We therefore propose a computer-aided landmark annotation approach that estimates the three-dimensional (3D) positions of 21 selected landmarks. Methods: The procedure involves an adaptive cluster-based segmentation of bone tissues followed by an intensity-based registration of an annotated reference volume onto the patient's CBCT head volume. The outcome of the annotation process is presented to the clinician as a 3D surface of the patient's skull with the estimated landmarks displayed on it. Moreover, each landmark is centered within a spherical confidence region that can assist the clinician in a subsequent manual refinement of the annotation. The algorithm was validated on 18 CBCT images. Results: Automatic segmentation showed a high level of accuracy, with no significant difference between automatically and manually determined threshold values. The overall median localization error was 1.99 mm, with an interquartile range (IQR) of 1.22–2.89 mm. Conclusion: The obtained results are promising: segmentation proved to be very robust, and the achieved landmark annotation accuracy was acceptable for most landmarks and comparable with other available methods.
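The abstract outlines a two-stage pipeline: segmentation of bone tissue, then intensity-based registration of an annotated reference volume so that its landmark annotations can be transferred onto the patient scan. The sketch below illustrates that idea with SimpleITK. The file names, placeholder landmark coordinates, Otsu multi-thresholding (standing in for the paper's adaptive cluster-based segmentation), and the rigid mutual-information registration settings are all illustrative assumptions, not the authors' implementation; note also that the sketch treats the reference as the fixed image so that the estimated transform maps reference-space points directly into patient space.

```python
import SimpleITK as sitk

# Minimal sketch of the annotation pipeline described in the abstract.
# Assumptions: file names, landmark coordinates, and parameter values are
# placeholders; Otsu multi-thresholding stands in for the paper's adaptive
# cluster-based bone segmentation.

patient = sitk.ReadImage("patient_cbct.nii.gz", sitk.sitkFloat32)
reference = sitk.ReadImage("annotated_reference.nii.gz", sitk.sitkFloat32)

# Reference landmarks in world coordinates (mm); placeholder values.
reference_landmarks = {
    "Nasion": (0.4, 92.1, 41.7),
    "Sella": (0.2, 55.3, 28.9),
}

# Stand-in bone segmentation: keep the brightest of three intensity classes
# (air / soft tissue / bone); the mask can be used to render the skull surface.
labels = sitk.OtsuMultipleThresholds(patient, 2)
bone_mask = labels == 2  # label 2 corresponds to the highest-intensity class

# Rigid intensity-based registration. With the reference as the fixed image,
# the resulting transform maps reference-space points into patient space.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
initial = sitk.CenteredTransformInitializer(
    reference, patient, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)
transform = reg.Execute(reference, patient)  # fixed=reference, moving=patient

# Transfer each annotated landmark from the reference into the patient volume;
# a spherical confidence region could then be centered on each estimate.
patient_landmarks = {name: transform.TransformPoint(p)
                     for name, p in reference_landmarks.items()}
for name, (x, y, z) in patient_landmarks.items():
    print(f"{name}: ({x:.1f}, {y:.1f}, {z:.1f}) mm")
```

In practice the paper reports a deformable, intensity-based registration pipeline and 21 landmarks; the rigid transform and two landmarks above are only meant to show how annotations propagate through an estimated spatial transform.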
Keywords: Cephalometry; Cone beam CT; Image registration; Image segmentation; Anatomic Landmarks; Cone-Beam Computed Tomography; Humans; Image Processing, Computer-Assisted; Imaging, Three-Dimensional; Reproducibility of Results; Skull; Algorithms; Surgery; Radiology, Nuclear Medicine and Imaging; Health Informatics

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1039506
Citations
  • PubMed Central (PMC): 17
  • Scopus: 49
  • Web of Science (ISI): 36