Optimizing Echo State Networks for Learning Temporal Sequences
Abstract
Echo state networks are a relatively new type of artificial neural network. An echo state network consists of a recurrent neural network with input, output, and feedback weights. What separates echo state networks from other recurrent neural networks is that only the output weights are trained, making training much simpler than earlier methods. This, together with the memory inherent in the recurrent network, makes echo state networks well suited to learning temporal sequences. In this thesis, echo state networks were used to learn and predict several different temporal sequences. The results of these tests showed that an echo state network can learn and predict complicated temporal sequences, but also that keeping the reservoir in a stable state is often hard, especially when the network is fed its own predictions. Further tests investigating whether noise could stabilize the system showed some improvement in handling nearly nonexistent errors that slowly drove the system into unstable states, but did little to diminish the instability created by small regular deviations. In the tests where specialized recurrent neural networks were developed, an increase in performance was shown for some datasets. However, it also became apparent that several aspects of the process must be changed for practical use: the process was too resource-demanding, making it very slow, and the focus on minimizing the squared error led to undesirable results in several cases.
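To make the training scheme described above concrete, the following is a minimal sketch of an echo state network in Python with NumPy. It is an illustration, not the implementation used in the thesis: the reservoir size, scaling factors, ridge parameter, and the sine-wave task are all assumptions chosen for a self-contained example. The key point it demonstrates is that the input and reservoir weights stay fixed and random, and only the output (readout) weights are fitted, here by ridge regression on the collected reservoir states.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100                      # illustrative sizes (assumptions)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random reservoir weights
# Rescale so the spectral radius is below 1, a common heuristic for the
# echo state property (fading memory of past inputs).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return the state at each step."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave from the current one.
t = np.arange(400)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])                 # reservoir states, one row per step
Y = u[1:]                                 # one-step-ahead targets
washout = 50                              # discard the initial transient
X, Y = X[washout:], Y[washout:]

# Train ONLY the readout, via ridge regression with states as rows:
# W_out = (X^T X + beta I)^{-1} X^T Y
beta = 1e-6
W_out = np.linalg.solve(X.T @ X + beta * np.eye(n_res), X.T @ Y)

pred = X @ W_out
rmse = float(np.sqrt(np.mean((pred - Y) ** 2)))
print(rmse)                               # training RMSE on the sine task
```

Feeding `pred` back in place of the true input turns this one-step predictor into a free-running generator, which is the setting where the stability problems discussed above appear.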