dc.contributor.author: Moe, Signe
dc.contributor.author: Remonato, Filippo
dc.contributor.author: Grøtli, Esten Ingar
dc.contributor.author: Gravdahl, Jan Tommy
dc.identifier.citation: Proceedings of Machine Learning Research (PMLR). 2020, 120, 1-9. en_US
dc.description.abstract: Recurrent Neural Networks (RNNs) have a form of memory in which the output of a node at one timestep is fed back as input at the next timestep, in addition to the data from the previous layer. This makes them highly suitable for timeseries analysis. However, standard RNNs have known weaknesses, such as struggling with long-term memory. In this paper, we suggest a new recurrent network structure called the Linear Antisymmetric RNN (LARNN). This structure is based on the numerical solution of an Ordinary Differential Equation (ODE) whose stability properties yield a stable solution, which corresponds to long-term memory. Three numerical methods are suggested for solving the ODE: Forward Euler, Backward Euler, and the midpoint method. The suggested structure has been implemented in Keras, and several simulated datasets have been used to evaluate its performance. In the investigated cases, the LARNN performs better than or similarly to the Long Short-Term Memory (LSTM) network, the current state of the art for RNNs. Keywords: Recurrent Neural Network, Long-Term Memory, Timeseries Analysis en_US
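The core idea in the abstract — a hidden state governed by a linear ODE with an antisymmetric weight matrix, discretized with Forward Euler — can be sketched as below. This is a minimal illustration, not the paper's Keras implementation: the dimensions, parameter names, and step size `eps` are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
n_hidden, n_input = 4, 2

# Antisymmetric weight matrix W = M - M^T, so W^T = -W.
# Its eigenvalues are purely imaginary, which is the stability
# property the abstract ties to long-term memory.
M = rng.standard_normal((n_hidden, n_hidden))
W = M - M.T
V = rng.standard_normal((n_hidden, n_input))

def forward_euler_step(h, x, eps=0.01):
    """One Forward Euler step of the linear ODE dh/dt = W h + V x."""
    return h + eps * (W @ h + V @ x)

# Unroll the recurrence over a short input sequence.
h = np.zeros(n_hidden)
for _ in range(10):
    x_t = rng.standard_normal(n_input)
    h = forward_euler_step(h, x_t)
```

Backward Euler and the midpoint method, also mentioned in the abstract, would replace the explicit update with an implicit or averaged one (e.g. solving `(I - eps*W) h_next = h + eps*V x` for Backward Euler), trading extra computation for better stability.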
dc.rights: Navngivelse 4.0 Internasjonal (Attribution 4.0 International)
dc.title: Linear Antisymmetric Recurrent Neural Networks en_US
dc.type: Peer reviewed en_US
dc.type: Journal article en_US
dc.source.journal: Proceedings of Machine Learning Research (PMLR) en_US
dc.relation.project: Norges forskningsråd: 294544 en_US
dc.description.localcode: © 2020 The Authors. This is an open access article under the CC BY license.


Except where otherwise noted, this item's license is described as Navngivelse 4.0 Internasjonal (Attribution 4.0 International).