Comparative Assessment of Point Cloud Annotation Workflows for Applications in Architectural and Spatial Studies

Treccani, Daniele;Adami, Andrea
2025-01-01

Abstract

Manual annotation of 3D point clouds is essential for creating high-quality datasets used in training machine learning models for semantic classification. Despite the development of various annotation tools (ranging from research prototypes to commercial platforms), their usability, functionality, and availability vary greatly depending on users’ technical expertise and the intended application. This study presents a comparative evaluation of manual point cloud annotation tools, focusing on their effectiveness for users with limited Geomatics experience, such as architecture and urban planning professionals. The research encompasses both literature-based and market-driven analyses to identify prevalent tools, including open-source, commercial, and web-based solutions. Eight selected platforms (CloudCompare, QGIS, ArcGIS Pro, Autodesk ReCap Pro, Leica Cyclone 3DReshaper, GreenValley LiDAR360, TerraScan, and Pointly) were tested on two case studies: an indoor university office and an outdoor urban area in Mantova, Italy. Tools were assessed considering usability, interface design, supported formats, classification capabilities, and required user expertise. Results highlight differences in usability and performance. The study concludes that tool selection should align with user expertise, project scale, and environmental complexity. The findings aim to support informed software choices for professionals in built heritage, architecture, and urban studies requiring reliable manual point cloud annotation solutions.
2025
30th CIPA Symposium “Heritage Conservation from Bits: From Digital Documentation to Data-driven Heritage Conservation”
Point Cloud Annotation, Manual Classification, Ground Truth, Semantic Segmentation, Software Comparison
Files in this record:

File: isprs-archives-XLVIII-M-9-2025-997-2025.pdf
Access: open access
Description: Publisher’s version
Size: 2.53 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1297790