Conference Publishing

Braden Thorne

Authors: Braden Thorne, Thomas Jüngling, Michael Small, Débora Corrêa, and Ayham Zaitouny

2022-12-07

Publication

AI 2022: Advances in Artificial Intelligence, pp 442-455

Lecture Notes in Computer Science, vol 13728

Thorne, B., Jüngling, T., Small, M., Corrêa, D., Zaitouny, A. (2022). A Novel Approach to Time Series Complexity via Reservoir Computing. In: Aziz, H., Corrêa, D., French, T. (eds) AI 2022: Advances in Artificial Intelligence. AI 2022. Lecture Notes in Computer Science, vol 13728. Springer, Cham. https://doi.org/10.1007/978-3-031-22695-3_31

Quality Indicators

Peer Reviewed

Relevance to the Centre

When working with time series, it is often beneficial to know how complex the signal is. Periodic, chaotic and random signals (from least to most complex) may each be approached in different ways, and identifying a signal as belonging to one of these categories can reveal a lot about the underlying system. In the field of time series analysis, permutation entropy has emerged as one of the premier measures of time series complexity because it can be calculated from data alone. We propose an alternative method for calculating complexity based on the machine learning paradigm of reservoir computing, and show how the outputs of these neural networks capture similar information regarding signal complexity. For well-known dynamical systems, our proposed measure behaves similarly to both the Lyapunov exponent and permutation entropy. Additionally, we assess the dependence of our measure on key hyperparameters of the model, drawing conclusions about the invariance of the measure and possible implications for informing network structure.
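For context, the permutation entropy baseline referred to above (the Bandt–Pompe measure) can indeed be computed from data alone. The following is a minimal illustrative sketch of that standard measure, not the authors' implementation or their proposed reservoir-computing alternative; the function name and parameters are chosen for illustration.

```python
import math
from collections import Counter

def permutation_entropy(x, order=3, delay=1):
    """Normalised Bandt-Pompe permutation entropy of a 1-D sequence.

    Counts the ordinal patterns of length `order` (sampled with the
    given `delay`) and returns the Shannon entropy of their empirical
    distribution, normalised by log(order!) so the result lies in [0, 1].
    """
    n = len(x) - (order - 1) * delay
    counts = Counter()
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        # Ordinal pattern: the argsort of the window's values
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    probs = (c / n for c in counts.values())
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))
```

A periodic signal visits only a few ordinal patterns and scores low, while a random signal visits all patterns roughly uniformly and scores near 1, matching the least-to-most-complex ordering described in the abstract.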

DOI: 10.1007/978-3-031-22695-3_31
