TY - CONF
T1 - Echo state networks with decoupled reservoir states
AU - Zhang, Bai
AU - Wang, Yue
PY - 2008/12/1
Y1 - 2008/12/1
AB - Echo state networks (ESNs) are a novel form of recurrent neural network that provides an efficient and powerful computational model for approximating dynamic nonlinear systems. Why a large, random, fixed recurrent neural network (the reservoir) performs so astonishingly well at approximating nonlinear systems remains a mystery. In this paper, we first compare two reservoir scenarios in ESNs, i.e., sparsely versus fully connected reservoirs, and show that the eigenvalues of these reservoir weight matrices have the same limit distribution in the complex plane. We then discuss the link between the eigenvalues of the reservoir weight matrix and the approximation ability of the ESN in a simplified case. We propose a new ESN with decoupled reservoir states, in which the reservoir neurons are decoupled into single neurons or pairs of neurons. A reservoir state back-elimination strategy is presented that not only reduces model complexity but also improves numerical stability when computing the output weights. The proposed model is tested on a communication channel equalization problem and applied to gene expression time-series modeling, with very promising results.
UR - http://www.scopus.com/inward/record.url?scp=58049171366&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=58049171366&partnerID=8YFLogxK
DO - 10.1109/MLSP.2008.4685521
M3 - Conference contribution
AN - SCOPUS:58049171366
SN - 9781424423767
T3 - Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
SP - 444
EP - 449
BT - Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
T2 - 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
Y2 - 16 October 2008 through 19 October 2008
ER -