Show simple item record

dc.contributor.advisor: Downing, Keith
dc.contributor.author: Vik, Mikael Eikrem
dc.date.accessioned: 2018-11-05T15:01:08Z
dc.date.available: 2018-11-05T15:01:08Z
dc.date.created: 2006-06-01
dc.date.issued: 2006
dc.identifier: ntnudaim:1526
dc.identifier.uri: http://hdl.handle.net/11250/2571091
dc.description.abstract: This thesis describes a connectionist approach to learning and long-term memory consolidation, inspired by empirical studies on the roles of the hippocampus and neocortex in the brain. The existence of complementary learning systems is due to the demands posed on our cognitive system by the nature of our experiences. It has been shown that dual-network architectures utilizing information transfer can successfully avoid the phenomenon of catastrophic forgetting involved in multiple sequence learning. The experiments involve a Reverberated Simple Recurrent Network which is trained on multiple sequences, with the memory being reinforced by means of self-generated pseudopatterns. My focus is on how differentiated learning speed affects the level of forgetting, without explicit training on the data used to form the existing memory.
dc.language: eng
dc.publisher: NTNU
dc.subject: Informatikk, Kunstig intelligens og læring
dc.title: Reducing catastrophic forgetting in neural networks using slow learning
dc.type: Master thesis
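
The abstract describes rehearsal with self-generated pseudopatterns as the mechanism that protects existing memory while new material is learned. Below is a minimal illustrative sketch of that idea, not the thesis's actual model: it substitutes a small feedforward network trained by plain backpropagation for the Reverberated Simple Recurrent Network, uses arbitrary binary mappings in place of the learned sequences, and every name, network size, learning rate, and pattern count is an assumption chosen for illustration.

# Minimal sketch of pseudopattern rehearsal for reducing catastrophic forgetting.
# Hypothetical simplification: a small feedforward net stands in for the thesis's
# Reverberated Simple Recurrent Network; all hyperparameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyNet:
    """One-hidden-layer network trained by plain backprop on an MSE loss."""
    def __init__(self, n_in, n_hidden, n_out, lr):
        self.lr = lr  # learning speed: a "slow" net would use a smaller value here
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)
        self.y = sigmoid(self.h @ self.W2)
        return self.y

    def train_step(self, x, target):
        y = self.forward(x)
        err = y - target
        d_out = err * y * (1 - y)                            # sigmoid derivative at output
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)  # backpropagated hidden error
        self.W2 -= self.lr * np.outer(self.h, d_out)
        self.W1 -= self.lr * np.outer(x, d_hid)

def pseudopatterns(net, n_patterns, n_in):
    """Self-generated pseudopatterns: random inputs paired with the net's own outputs."""
    xs = rng.integers(0, 2, (n_patterns, n_in)).astype(float)
    return [(x, net.forward(x).copy()) for x in xs]

# Two "memories" (arbitrary binary mappings standing in for the thesis's sequences).
n_in, n_out = 8, 4
task_a = [(rng.integers(0, 2, n_in).astype(float),
           rng.integers(0, 2, n_out).astype(float)) for _ in range(5)]
task_b = [(rng.integers(0, 2, n_in).astype(float),
           rng.integers(0, 2, n_out).astype(float)) for _ in range(5)]

net = TinyNet(n_in, 12, n_out, lr=0.5)

# Phase 1: learn task A.
for _ in range(2000):
    for x, t in task_a:
        net.train_step(x, t)

# Phase 2: learn task B, interleaving pseudopatterns that stand in for the memory of A.
rehearsal = pseudopatterns(net, 20, n_in)
for _ in range(2000):
    for x, t in task_b:
        net.train_step(x, t)
    for x, t in rehearsal:   # rehearsal of self-generated patterns protects the old mapping
        net.train_step(x, t)

err_a = np.mean([(net.forward(x) - t) ** 2 for x, t in task_a])
err_b = np.mean([(net.forward(x) - t) ** 2 for x, t in task_b])
print(f"MSE on old task A: {err_a:.3f}   MSE on new task B: {err_b:.3f}")

Under these toy assumptions, the lr parameter is where the thesis's question about differentiated learning speed would enter: rerunning the sketch with a smaller lr during phase 2 would illustrate how slower learning limits the extent to which training on new material overwrites the weights encoding the old mapping.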


Associated file(s)


This item appears in the following collection(s)
