A deep learning mixed-data type approach for the classification of FHR signals
Daniele, Beniamino; Signorini, Maria Gabriella
2022-01-01
Abstract
Cardiotocography (CTG) is a widely used monitoring practice in obstetric and gynecological clinics to assess fetal well-being through the analysis of the Fetal Heart Rate (FHR) and uterine contraction signals. Due to the complex dynamics regulating the Fetal Heart Rate, a reliable visual interpretation of the signal is almost impossible and results in significant subjective inter- and intra-observer variability. Moreover, the introduction of a few parameters obtained from computerized analysis did not solve the problem of robust antenatal diagnosis. Hence, during the last decade, computer-aided diagnosis systems based on artificial intelligence (AI) and machine learning techniques have been developed to support medical decisions. The present work proposes a hybrid approach based on a neural architecture that receives heterogeneous input data (a set of quantitative parameters and images) to classify healthy and pathological fetuses. The quantitative regressors, which are known to represent different aspects of correct fetal development and are thus related to the fetal health status, are combined with features implicitly extracted from various image representations of the FHR signal in order to improve classification performance. This is achieved with a neural model composed of two connected branches: a Multi-Layer Perceptron (MLP) and a Convolutional Neural Network (CNN). The architecture was trained on a large and balanced set of clinical data (14,000 CTG tracings: 7,000 healthy and 7,000 pathological) recorded during ambulatory non-stress tests at the University Hospital Federico II, Napoli, Italy. After hyperparameter tuning and training, the proposed neural network reached an overall accuracy of 80.1%, a promising result given that it was obtained on such a large dataset.
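The mixed-data design described above (an MLP branch for the quantitative parameters and a CNN branch for the image representation of the FHR signal, fused before a common classification head) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, the number of quantitative parameters, and the assumed 1×128×128 image resolution are placeholder assumptions, not values taken from the paper.

```python
# Illustrative sketch of a two-branch mixed-data model for binary
# healthy/pathological FHR classification. All dimensions are assumptions.
import torch
import torch.nn as nn

class MixedDataFHRClassifier(nn.Module):
    def __init__(self, n_params: int = 10, img_channels: int = 1):
        super().__init__()
        # MLP branch: quantitative CTG parameters (assumed n_params regressors)
        self.mlp = nn.Sequential(
            nn.Linear(n_params, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
        )
        # CNN branch: image representation of the FHR signal
        self.cnn = nn.Sequential(
            nn.Conv2d(img_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (batch, 32, 1, 1)
            nn.Flatten(),             # -> (batch, 32)
        )
        # Fusion head: concatenated features -> single logit
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 32), nn.ReLU(),
            nn.Linear(32, 1),  # pair with BCEWithLogitsLoss during training
        )

    def forward(self, params: torch.Tensor, image: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.mlp(params), self.cnn(image)], dim=1)
        return self.head(fused)

# Example forward pass with dummy data
model = MixedDataFHRClassifier(n_params=10)
logits = model(torch.randn(4, 10), torch.randn(4, 1, 128, 128))
print(logits.shape)  # torch.Size([4, 1])
```

In this kind of fusion, the two branches are trained jointly and their feature vectors are simply concatenated, so the classification head can weigh the explicit quantitative parameters against the features learned from the signal images.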
File | Description | Size | Format
---|---|---|---
fbioe-10-887549.pdf | Final published paper (publisher's version, open access) | 1.42 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.