MLFNet: A novel multi-level fusion mechanical fault diagnosis network under limited and imbalanced datasets using multi-source information
Karimi, Hamid Reza;
2026-01-01
Abstract
With the continuous improvement of the Internet of Things and sensing technology, it is becoming easier to obtain multi-source information from different types of sensors. In multi-source information fusion (MSIF), three types of fusion (data-level, feature-level, and decision-level) are crucial for bridging multi-source information and the final fault diagnosis. However, most existing intelligent fault diagnosis (IFD) methods use only a single type and ignore the other two, so the high-quality features that deep learning methods extract from the multi-source information remain incomplete. Meanwhile, IFD under limited and imbalanced (L&I) datasets is a challenging and important issue for practical industrial applications owing to complex and harsh operating conditions. Moreover, current MSIF-based methods still have defects in all three types of fusion. To tackle these issues simultaneously, a multi-level fusion network based on multi-source information (MLFNet) is presented to enable all three types of fusion in a unified deep network and to achieve fault diagnosis under L&I conditions. At the data level, a signal preprocessing technique is first employed to obtain RGB images from multi-source signals. At the feature level, a dynamic multiscale feature extraction (DMFE) module is built to select the appropriate feature scale, and a hierarchical feature fusion network (HFFN) is proposed to mine shallow and deep features, which not only resolves the unreliability of a fixed feature scale but also yields more representative and complementary features. At the decision level, dual classifiers are used to produce more accurate diagnostic results under L&I datasets. 
Extensive experiments conducted on four datasets demonstrate the validity and superiority of the proposed method for fault diagnosis under L&I conditions, outperforming the state-of-the-art (SOTA) methods compared in this article.
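The abstract does not detail the data-level preprocessing, only that multi-source signals are converted into RGB images. A minimal sketch of one common way to do this, assuming each of three sensor channels is min-max normalized and reshaped into one image channel (the function name `signals_to_rgb` and the image side length are illustrative, not from the paper):

```python
import numpy as np

def signals_to_rgb(sig_a, sig_b, sig_c, side=64):
    """Map three 1-D sensor signals onto the R, G, B channels of one image.

    Each signal is truncated to side*side samples, min-max normalized to
    [0, 255], and reshaped to a (side, side) grid; the three grids are
    stacked as the image channels.
    """
    channels = []
    for sig in (sig_a, sig_b, sig_c):
        seg = np.asarray(sig, dtype=np.float64)[: side * side]
        lo, hi = seg.min(), seg.max()
        norm = (seg - lo) / (hi - lo + 1e-12)          # min-max normalize
        channels.append((norm * 255).astype(np.uint8).reshape(side, side))
    return np.stack(channels, axis=-1)                 # (side, side, 3)

# Example: three synthetic vibration-like signals from different sensors
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64 * 64)
img = signals_to_rgb(np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size),
                     np.sin(2 * np.pi * 120 * t),
                     rng.standard_normal(t.size))
print(img.shape, img.dtype)   # (64, 64, 3) uint8
```

Any image-forming transform (e.g., a time-frequency representation per channel) could replace the plain reshape; the point is that three heterogeneous sensor streams become one fused image input.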
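The feature-level idea of DMFE, selecting an appropriate feature scale instead of fixing one, can be illustrated with a toy numpy version: filters of several widths extract features in parallel, and a softmax over per-scale energies softly weights them. The energy-based softmax is a stand-in for the learned selection in the paper, whose exact mechanism the abstract does not specify:

```python
import numpy as np

def multiscale_features(x, scales=(3, 7, 15)):
    """Extract features at several temporal scales and softly weight them.

    Each scale is a moving-average filter of a different width; a softmax
    over per-scale energies plays the role of the dynamic scale selector
    (an illustrative stand-in for the learned weighting in DMFE).
    """
    feats = []
    for k in scales:
        kernel = np.ones(k) / k
        feats.append(np.convolve(x, kernel, mode="same"))
    feats = np.stack(feats)                      # (n_scales, len(x))
    energy = (feats ** 2).mean(axis=1)           # one scalar per scale
    w = np.exp(energy) / np.exp(energy).sum()    # softmax weights per scale
    return (w[:, None] * feats).sum(axis=0), w

x = np.sin(np.linspace(0, 8 * np.pi, 256))
fused, weights = multiscale_features(x)
print(fused.shape, weights)
```

In a deep network these parallel branches would be convolutional layers of different kernel sizes and the weights would be learned end to end; the toy version only shows how multiple scales can coexist and be combined adaptively rather than fixed a priori.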


