Automatic detection of artifacts in photoplethysmography signals through convolutional neural networks during robot-assisted gait rehabilitation

Costantini S.; Biffi E.; Storm F. A.; Andreoni G.; Bianchi A. M.
2024-01-01

Abstract

Photoplethysmography (PPG) is a widely used noninvasive optical technique for assessing various cardiovascular parameters in both clinical and real-world settings. Despite its popularity, driven by wearable devices, the PPG signal is prone to a variety of artifacts, including motion, ambient light interference, and sensor detachment. To overcome this issue, this study proposed two new convolutional neural network architectures, CNN-PPG and CNN-PPG+A, designed to perform automatic PPG artifact detection. While CNN-PPG relied exclusively on the PPG data, CNN-PPG+A incorporated additional information from acceleration signals. Both models were trained on a dataset of 110 robot-assisted gait rehabilitation activity sessions from 46 subjects (mean age (SD) 15.5 (14.2); 30 males) affected by cerebral palsy, acquired brain injury, or hereditary spastic paraplegia. Model performance was evaluated on signal instances of 2, 3, and 5 seconds. The results demonstrated that both models accurately identified artifact-corrupted PPG instances, with the best performance achieved on 5-second signal instances, yielding an F2 score of 0.8. Although CNN-PPG+A introduced greater complexity, it did not consistently outperform CNN-PPG, suggesting potential redundancies in the features extracted from the acceleration data, or an absence of correlation between motion and noise for artifacts driven by sensor decoupling from the skin. The study highlights the challenges of PPG artifact detection in high-motion clinical scenarios and contributes to the improvement of PPG-based monitoring systems, emphasizing the need for tailored approaches in clinical settings.
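The abstract does not detail the network layers, so the following is only a minimal, hypothetical sketch of the kind of 1-D CNN window classifier it describes: a binary artifact/clean classifier over fixed-length PPG windows, with the input channel count switching between a PPG-only configuration (CNN-PPG-like) and a PPG-plus-acceleration configuration (CNN-PPG+A-like). The sampling rate, window length, layer sizes, and channel counts below are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical sketch (not the paper's architecture): a 1-D CNN that labels
# fixed-length PPG windows as artifact-corrupted or clean. Window length,
# sampling rate, and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class PPGArtifactCNN(nn.Module):
    def __init__(self, in_channels: int = 1):
        # in_channels = 1 -> PPG only (CNN-PPG-like configuration)
        # in_channels = 4 -> PPG + 3-axis acceleration (CNN-PPG+A-like)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(32, 1)  # single logit: artifact vs. clean

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x shape: (batch, channels, samples)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)

# Example: 5-second windows at an assumed 64 Hz sampling rate (320 samples)
model = PPGArtifactCNN(in_channels=1)
windows = torch.randn(8, 1, 5 * 64)               # 8 PPG-only windows
probs = torch.sigmoid(model(windows)).squeeze(-1)
is_artifact = probs > 0.5
print(is_artifact)
```

The F2 score reported in the abstract is the F-beta score with beta = 2, i.e. F2 = 5·P·R / (4·P + R), which weights recall more heavily than precision; this suits artifact detection, where missing a corrupted window is costlier than flagging a clean one. Given window-level predictions and labels, it can be computed, for example, with sklearn.metrics.fbeta_score(y_true, y_pred, beta=2).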
2024 IEEE 8th Forum on Research and Technologies for Society and Industry Innovation (RTSI)
979-8-3503-6213-8
Files in this record:
Costantini et al. - 2024 - Automatic Detection of Artifacts in Photoplethysmography Signals Through Convolutional Neural Networ.pdf — Publisher's version, open access, 1.47 MB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1287250
Citations
  • PubMed Central: n/a
  • Scopus: 1
  • Web of Science (ISI): 1