Linear Antisymmetric Recurrent Neural Networks
Peer reviewed, Journal article
Published version

Permanent link: https://hdl.handle.net/11250/2725302
Publication date: 2020
Original version: Proceedings of Machine Learning Research (PMLR). 2020, 120, 1-9.

Abstract
Recurrent Neural Networks (RNNs) have a form of memory in which the output of a node at one timestep is fed back as input at the next timestep, in addition to the data from the previous layer. This makes them well suited for time series analysis. However, standard RNNs have known weaknesses, such as struggling with long-term memory. In this paper, we suggest a new recurrent network structure called the Linear Antisymmetric RNN (LARNN). This structure is based on the numerical solution of an Ordinary Differential Equation (ODE) whose stability properties yield a stable solution, which corresponds to long-term memory. Three numerical methods are suggested for solving the ODE: Forward Euler, Backward Euler, and the midpoint method. The suggested structure has been implemented in Keras, and several simulated datasets have been used to evaluate its performance. In the investigated cases, the LARNN performs better than or comparably to the Long Short-Term Memory (LSTM) network, the current state of the art for RNNs.

Keywords: Recurrent Neural Network, Long-Term Memory, Time Series Analysis
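The abstract does not spell out the update equations, so the following is a minimal NumPy sketch of what a linear antisymmetric recurrence could look like, assuming a hidden-state ODE of the form h'(t) = A h(t) + V x(t) with A = W − Wᵀ antisymmetric; the eigenvalues of an antisymmetric matrix are purely imaginary, so the homogeneous solution neither decays nor blows up, which is the stability property the abstract ties to long-term memory. The function names, the step size eps, and the exact placement of the input term are illustrative assumptions, not the paper's definitions. Two of the three schemes the abstract names (Forward Euler and the implicit midpoint rule) are sketched.

```python
import numpy as np

def larnn_forward_euler(x, W, V, eps=0.01):
    """Forward Euler sketch: h_{t+1} = h_t + eps * (A h_t + V x_t).

    x : (T, d_in) input sequence, W : (d_h, d_h), V : (d_h, d_in).
    A = W - W^T is antisymmetric, so its eigenvalues are purely imaginary.
    """
    A = W - W.T
    h = np.zeros(W.shape[0])
    out = []
    for x_t in x:
        h = h + eps * (A @ h + V @ x_t)  # explicit Euler step
        out.append(h)
    return np.stack(out)                 # (T, d_h) hidden states

def larnn_midpoint(x, W, V, eps=0.01):
    """Implicit midpoint sketch: (I - eps/2 A) h_{t+1} = (I + eps/2 A) h_t + eps V x_t.

    For antisymmetric A the propagator P is a Cayley transform, which is
    orthogonal, so the homogeneous dynamics preserve the hidden-state norm.
    """
    A = W - W.T
    I = np.eye(W.shape[0])
    P = np.linalg.solve(I - 0.5 * eps * A, I + 0.5 * eps * A)  # Cayley transform
    Q = np.linalg.solve(I - 0.5 * eps * A, eps * V)            # input map
    h = np.zeros(W.shape[0])
    out = []
    for x_t in x:
        h = P @ h + Q @ x_t
        out.append(h)
    return np.stack(out)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((100, 3))     # T=100 steps, 3 input features
    W = rng.standard_normal((8, 8))       # 8 hidden units
    V = 0.1 * rng.standard_normal((8, 3))
    print(larnn_forward_euler(x, W, V).shape, larnn_midpoint(x, W, V).shape)
```

One plausible reason for comparing all three schemes: for a purely imaginary eigenvalue λ = iω, Forward Euler has amplification factor |1 + εiω| > 1 (slow growth), Backward Euler has |1 − εiω|⁻¹ < 1 (slow decay), while the midpoint rule's factor |(1 + εiω/2)/(1 − εiω/2)| = 1 exactly, making it the discrete analogue of the norm-preserving continuous dynamics. Whether the paper draws this exact comparison is not stated in the abstract.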