
An IMU-Based Wearable System for Respiratory Rate Estimation in Static and Dynamic Conditions

Angelucci A.; Aliverti A.
2023-01-01

Abstract

Purpose: Breathing parameters change with activity and posture, but currently available solutions can perform measurements only during static conditions. Methods: This article presents an innovative wearable sensor system composed of three inertial measurement units (IMUs) that simultaneously estimates respiratory rate (RR) in static and dynamic conditions and performs human activity recognition (HAR) with the same sensing principle. Two units detect breathing-related chest wall movements (one on the thorax, one on the abdomen); the third, reference unit is placed on the lower back. All units compute the quaternions describing the subject's movement and stream data continuously to an app over the ANT transmission protocol. The 20 healthy subjects involved in the research (9 men, 11 women) were between 23 and 54 years old, with mean age 26.8 years, mean height 172.5 cm, and mean weight 66.9 kg. Data collected from these subjects during different postures and activities were analyzed to extract RR. Results: Statistically significant differences between dynamic activities ("walking slow", "walking fast", "running", and "cycling") and static postures were detected (p < 0.05), confirming that the obtained measurements are in line with physiology even during dynamic activities. Data from the reference unit alone and from all three units were used as inputs to artificial intelligence methods for HAR. With data from the reference unit only, the Gated Recurrent Unit was the best-performing method (97% accuracy); with all three units, a 1D Convolutional Neural Network performed best (99% accuracy). Conclusion: Overall, the proposed solution shows that it is possible to perform simultaneous HAR and RR measurements in static and dynamic conditions with the same sensor system.
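The abstract describes extracting RR from breathing-related orientation oscillations of the thorax and abdomen units. A common way to do this is to treat a quaternion-derived inclination angle as a quasi-periodic signal and take the dominant frequency in the respiratory band as the RR. The sketch below illustrates that idea on a simulated chest-angle trace; the 50 Hz sampling rate, the 0.05–1 Hz band, and the spectral-peak method are assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np

fs = 50.0                      # assumed IMU sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)   # 60 s analysis window
rr_true = 15                   # simulated RR, breaths per minute

# Simulated thorax inclination angle (degrees): a breathing
# oscillation at rr_true/60 Hz plus measurement noise.
rng = np.random.default_rng(0)
angle = 2.0 * np.sin(2 * np.pi * (rr_true / 60.0) * t) + 0.3 * rng.normal(size=t.size)

# Estimate RR as the dominant spectral peak in the respiratory
# band (here assumed 0.05-1 Hz, i.e. 3-60 breaths/min).
spectrum = np.abs(np.fft.rfft(angle - angle.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs >= 0.05) & (freqs <= 1.0)
rr_est = freqs[band][np.argmax(spectrum[band])] * 60.0  # breaths/min
print(round(rr_est, 1))
```

With a 60 s window the frequency resolution is 1/60 Hz, i.e. 1 breath/min, which is why a relatively long window is needed for a spectral RR estimate; time-domain peak detection trades that resolution for faster updates.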
Keywords: e-Health; Human activity recognition; Internet of medical things; Respiratory monitoring; Telemedicine

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1250884