
An empirical assessment of the universality of ANNs to predict oscillatory time series

F. Dercole, M. Sangiorgio, Y. Schmirander
2020-01-01

Abstract

Artificial neural networks (ANNs) are universal function approximators and are therefore suitable to be trained as predictors of oscillatory time series. Though several ANN architectures have been tested to predict both synthetic and real-world time series, the universality of their predictive power has remained unexplored. Here we empirically test this universality across five well-known chaotic oscillators, limiting the analysis to the simplest architecture, namely a multi-layer feed-forward ANN trained to predict one sampling step ahead. To compare different predictors, data are sampled according to their frequency content and the ANN structure scales with the characteristic dimensions of the oscillator. Moreover, the quality of recursive multi-step-ahead predictions is compared in terms of the system’s (largest) Lyapunov exponent (LLE), i.e., the predictive power is measured in terms of the number of Lyapunov times (LT, the LLE inverse) predicted within a prescribed (relative) error. The results confirm the rather uniform predictive power of the proposed ANN architecture.
2020
21st IFAC World Congress
Machine learning, Time series modelling, Nonlinear system identification
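As a rough illustration of the evaluation protocol described in the abstract, the sketch below (not the authors' code) trains a feed-forward ANN one sampling step ahead, rolls it out recursively, and reports how many Lyapunov times the prediction stays within a prescribed relative error. The choice of the Lorenz system, the sampling step, the network size, the error threshold, and the LLE value are all assumptions made for the example.

```python
# Minimal sketch: one-step-ahead MLP predictor rolled out recursively,
# scored in Lyapunov times. Assumptions: Lorenz-63 with standard parameters,
# dt = 0.02, a (32, 32) MLP, 5% relative-error threshold, LLE ~ 0.9.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Sample the attractor; the sampling step is an assumption, not the paper's choice.
dt, n = 0.02, 20000
sol = solve_ivp(lorenz, (0, n * dt), [1.0, 1.0, 1.0],
                t_eval=np.arange(0, n * dt, dt), rtol=1e-9, atol=1e-9)
series = sol.y.T  # shape (n, 3)

# One-step-ahead training pairs: predict the state at t + dt from the state at t.
X, Y = series[:-1], series[1:]
split = int(0.8 * len(X))
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
mlp.fit(X[:split], Y[:split])

# Recursive multi-step-ahead rollout from the start of the test set.
horizon = 1500
state = X[split].copy()
pred = np.empty((horizon, 3))
for k in range(horizon):
    state = mlp.predict(state.reshape(1, -1))[0]
    pred[k] = state
truth = Y[split:split + horizon]

# Predictive power in Lyapunov times: elapsed time until the relative error
# first exceeds the threshold, divided by the Lyapunov time (LLE inverse).
lle = 0.9  # approximate largest Lyapunov exponent of the Lorenz system
lyap_time = 1.0 / lle
rel_err = np.linalg.norm(pred - truth, axis=1) / np.linalg.norm(truth, axis=1)
threshold = 0.05
valid = np.argmax(rel_err > threshold) if np.any(rel_err > threshold) else horizon
print(f"Prediction within {threshold:.0%} error for "
      f"{valid * dt / lyap_time:.2f} Lyapunov times")
```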
Files in this record:
C4_DercoleSangiorgioSchmirander_IFAC2020.pdf (open access, Publisher's version, Adobe PDF, 2.11 MB)


Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1169161
Citations
  • PMC: ND
  • Scopus: 14
  • Web of Science: 4