TY - JOUR
T1 - Input-to-state representation in linear reservoirs dynamics
AU - Verzelli, Pietro
AU - Alippi, Cesare
AU - Livi, Lorenzo
AU - Tino, Peter
PY - 2021/3/2
Y1 - 2021/3/2
N2 - Reservoir computing is a popular approach to designing recurrent neural networks, due to its training simplicity and approximation performance. The recurrent part of these networks is not trained (e.g., via gradient descent), making them appealing for analytical studies by a large community of researchers with backgrounds spanning from dynamical systems to neuroscience. However, even in the simple linear case, the working principle of these networks is not fully understood and their design is usually driven by heuristics. A novel analysis of the dynamics of such networks is proposed, which allows the investigator to express the state evolution using the controllability matrix. Such a matrix encodes salient characteristics of the network dynamics; in particular, its rank represents an input-independent measure of the memory capacity of the network. Using the proposed approach, it is possible to compare different reservoir architectures and explain why a cyclic topology achieves favorable results, as verified by practitioners.
KW - Dynamical systems
KW - echo state networks (ESNs)
KW - recurrent neural networks (RNNs)
KW - reservoir computing
UR - http://www.scopus.com/inward/record.url?scp=85102289724&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2021.3059389
DO - 10.1109/TNNLS.2021.3059389
M3 - Article
C2 - 33651697
SN - 2162-237X
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
ER -
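
Note appended to this record: the abstract's central object is the controllability matrix of a linear reservoir x_{t+1} = W x_t + w_in u_t, whose rank serves as an input-independent memory measure. The following is a minimal, self-contained Python sketch, illustrative only and not code from the paper; the helper name build_controllability, the reservoir size N = 50, and the weight 0.9 are assumptions. It builds this matrix for a random and a cyclic reservoir and prints their numerical ranks, mirroring the abstract's comparison of architectures.

import numpy as np

def build_controllability(W, w_in):
    # Stack the columns [w_in, W w_in, W^2 w_in, ..., W^(N-1) w_in].
    cols = [w_in]
    for _ in range(W.shape[0] - 1):
        cols.append(W @ cols[-1])
    return np.column_stack(cols)

rng = np.random.default_rng(0)
N = 50                                  # reservoir size (assumed)
w_in = rng.standard_normal(N)           # input weight vector (assumed)

# Random reservoir, rescaled to spectral radius 0.9.
W_rand = rng.standard_normal((N, N))
W_rand *= 0.9 / np.abs(np.linalg.eigvals(W_rand)).max()

# Cyclic (ring) reservoir: a single cycle with uniform weight 0.9.
W_cyc = 0.9 * np.roll(np.eye(N), 1, axis=0)

for name, W in [("random", W_rand), ("cyclic", W_cyc)]:
    R = build_controllability(W, w_in)
    print(name, "numerical rank of R:", np.linalg.matrix_rank(R))

In this sketch the cyclic reservoir's controllability matrix has columns that are scaled cyclic shifts of w_in, which keeps them well conditioned and the numerical rank high; this is one plausible reading of why the abstract reports favorable results for the cyclic topology.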