
A multimodal wearable system for continuous and real-time breathing pattern monitoring during daily activity

Qi W.; Aliverti A.
2020-01-01

Abstract

Objective: This study aims to understand breathing patterns during daily activities by developing a wearable respiratory and activity monitoring (WRAM) system. Methods: A novel multimodal fusion architecture is proposed to calculate respiratory and exercise parameters while simultaneously identifying human actions. A hybrid hierarchical classification (HHC) algorithm combining deep learning and threshold-based methods is presented to distinguish 15 complex activities, improving accuracy and computation speed. A series of signal processing algorithms are integrated to calculate breathing and motion indices. The designed wireless communication structure enables interaction among chest bands, mobile devices, and the data processing center. Results: The advantage of the proposed HHC method is evaluated by comparing its average accuracy (97.22%) and prediction time (0.0094 s) with those of machine learning and deep learning approaches. Nine breathing patterns during 15 activities were analyzed using data from 12 subjects. With 12 hours of naturalistic data collected from one participant, the WRAM system reports breathing and exercise performance within the identified motions. The demonstration shows the ability of the WRAM system to monitor multiple users' breathing and exercise status in real time. Conclusion: The present system demonstrates the usefulness of the framework for breathing pattern monitoring during daily activities, which may potentially be used in healthcare. Significance: The proposed multimodal-based WRAM system offers new insights into breathing function during exercise and activity, and presents a novel approach for precision medicine and health state monitoring.
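The HHC idea described above — a cheap threshold rule making a coarse split, with a learned model refining the decision within each branch — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold value, the feature (standard deviation of acceleration magnitude), the activity labels, and the stand-in rule-based fine model are all assumptions; the actual system uses deep networks and 15 activity classes.

```python
def hybrid_hierarchical_classify(accel_mag_std, fine_model):
    """Two-stage hybrid classifier sketch.

    Stage 1: a fast threshold rule splits static vs. dynamic activities.
    Stage 2: a learned model (stand-in callable here) refines the label
    within the selected branch.
    """
    STATIC_THRESHOLD = 0.05  # assumed value; would be tuned on real sensor data
    branch = "static" if accel_mag_std < STATIC_THRESHOLD else "dynamic"
    return branch, fine_model(branch, accel_mag_std)


def toy_fine_model(branch, feat):
    """Hypothetical per-branch refinement; the paper uses deep learning here."""
    if branch == "static":
        return "sitting" if feat < 0.02 else "standing"
    return "walking" if feat < 0.5 else "running"


# Example: a low-variance window is routed to the static branch first,
# so the fine model never has to consider dynamic activities for it.
label = hybrid_hierarchical_classify(0.01, toy_fine_model)
```

The design choice this illustrates is why HHC can be both fast and accurate: most windows are resolved cheaply at the threshold stage, and each fine model only discriminates among a smaller, more homogeneous subset of activities.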
Keywords: deep learning; human activity recognition; multimodal data fusion; signal processing; wearable devices
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1170119
Citations
  • PMC: 22
  • Scopus: 92
  • Web of Science (ISI): 85