Intelligent Framework for Human-Robot Collaboration: Dynamic Ergonomics and Adaptive Decision-Making
Iodice, Francesco; Momi, Elena; Ajoudani, Arash
2026-01-01
Abstract
The integration of collaborative robots into industrial environments has improved productivity but has also highlighted challenges in operator safety and ergonomics. This paper presents an integrated framework that combines advanced visual perception, continuous ergonomic monitoring, and adaptive Behaviour Tree (BT) decision-making. We adopt a supervisory human-robot collaboration paradigm in which the robot provides temporary, ergonomics-driven assistance only when real-time OWAS assessment indicates hazardous conditions (classes 3-4) and returns execution to the operator as risk subsides, preserving human primacy (97.4% human-led operations in our study). Our modular, scalable approach synthesizes deep learning models, advanced tracking, and dynamic ergonomic assessment. Experimental validation in controlled laboratory settings with industrial-grade sensing and simulation-in-the-loop actuation demonstrates strong performance across multiple dimensions: the perception module achieves 72.4% mAP@50:95; grasp-intention recognition reaches 92.5%; ergonomic risks are classified with a mean pose-monitoring latency of 0.081 s (95% CI [0.072, 0.093]); and BT policies trigger robotic interventions with a 0.07 s decision-layer latency (tick-to-trigger), approximately 56% faster than a representative prior HRC controller on comparable tasks, while the integrated end-to-end response averages 0.452 s (95% CI [0.283, 0.622]) and the safety logic remains auditable and deterministic. This comprehensive solution provides a robust platform for enhancing human-robot collaboration in industrial environments by prioritizing ergonomic safety, operational efficiency, and real-time adaptability.
| File | Description | Size | Format |
|---|---|---|---|
| Intelligent Framework for Human-Robot Collaboration.pdf | Publisher's version (open access) | 1.35 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


