Deep learning with cellular automaton-based reservoir computing
Journal article, Peer reviewed
Published version
Permanent link: http://hdl.handle.net/11250/2497868
Issue date: 2017
Abstract
Recurrent neural networks (RNNs) have been a prominent concept within artificial intelligence. They are inspired by biological neural networks (BNNs) and provide an intuitive and abstract representation of how BNNs work. Derived from the more generic artificial neural networks (ANNs), the recurrent ones are meant to be used for temporal tasks, such as speech recognition, because they are capable of memorizing past input. However, such networks are very time-consuming to train because of their inherent recurrent nature. Recently, echo state networks and liquid state machines have been proposed as possible RNN alternatives, under the name of reservoir computing (RC). Reservoir computers are far easier to train. In this paper, cellular automata (CAs) are used as a reservoir and are tested on the five-bit memory task (a well-known benchmark within the RC community). The work herein provides a method of mapping binary inputs from the task onto the automata and a recurrent architecture for handling the sequential aspects. Furthermore, a layered (deep) reservoir architecture is proposed. Performance is compared to earlier work and to the single-layer version. Results show that the single cellular automaton (CA) reservoir system yields results similar to state-of-the-art work. The system comprising two layered reservoirs shows a noticeable improvement over a single CA reservoir. This work lays the foundation for implementations of deep learning with CA-based reservoir systems.
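To make the general idea concrete, the following is a minimal Python sketch of a CA-based reservoir with a linear readout and a two-layer (deep) variant. The rule number, reservoir width, iteration count, random input mapping, layer wiring, and the toy training data are all illustrative assumptions; the paper's actual encoding of the five-bit memory task and its recurrent architecture are not reproduced here.

```python
# Minimal sketch of a cellular automaton (CA) reservoir with a linear readout.
# Rule number, reservoir width, iteration count, input mapping and layer wiring
# are illustrative assumptions, not the exact setup used in the paper.
import numpy as np

RULE = 90        # assumed elementary CA rule
WIDTH = 64       # assumed number of CA cells per reservoir
ITERATIONS = 4   # assumed CA evolution steps per input

# Lookup table: rule_bits[n] is the next cell value for neighbourhood index n.
rule_bits = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def ca_step(state):
    """One synchronous update of the elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    return rule_bits[4 * left + 2 * state + right]

def reservoir_features(input_bits, rng):
    """Map input bits onto random cell positions, evolve the CA, and use the
    concatenation of all intermediate states as the feature vector."""
    state = np.zeros(WIDTH, dtype=np.uint8)
    positions = rng.choice(WIDTH, size=len(input_bits), replace=False)
    state[positions] = input_bits        # hypothetical random input mapping
    states = []
    for _ in range(ITERATIONS):
        state = ca_step(state)
        states.append(state)
    return np.concatenate(states)

def layered_features(input_bits, rng):
    """Two-layer (deep) variant: the final CA state of the first reservoir is
    fed as input to a second reservoir (this wiring is an assumption)."""
    first = reservoir_features(input_bits, rng)
    second = reservoir_features(first[-WIDTH:], rng)
    return np.concatenate([first, second])

# Toy usage: train a linear readout on random binary sequences (placeholder
# data, not the five-bit memory task itself).
rng = np.random.default_rng(0)
X = np.stack([layered_features(rng.integers(0, 2, size=4), rng)
              for _ in range(200)]).astype(float)
y = rng.integers(0, 2, size=200).astype(float)
readout, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The sketch reflects the usual division of labour in reservoir computing: the CA evolution is fixed and untrained, and only the linear readout over the collected CA states is fitted, which is what makes such systems far cheaper to train than RNNs.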