dc.contributor.advisor: Downing, Keith
dc.contributor.advisor: Auerbach, Joshua
dc.contributor.author: Gammelsæter, Martin
dc.date.accessioned: 2015-10-06T08:31:13Z
dc.date.available: 2015-10-06T08:31:13Z
dc.date.created: 2015-08-24
dc.date.issued: 2015
dc.identifier: ntnudaim:13954
dc.identifier.uri: http://hdl.handle.net/11250/2352342
dc.description.abstract: In many of the problem domains typically tackled by deep learning, data is plentiful and cheap, but labeling it is tedious and expensive. Letting a model actively select the data instances it is uncertain about to train on, and ignore the rest, can reduce the percentage of instances that must be labeled to achieve satisfactory results. To this end, this project presents a novel semi-supervised active learning algorithm called Active Deep Dropout networks (ADD-networks). It is based on evaluating a deep neural network's uncertainty on unlabeled instances by measuring disagreement within a committee of networks derived from the original network. The committee members are Monte-Carlo-sampled from the full network using the concept of dropout (an illustrative sketch of this selection step follows the record below). Experiments on classifying handwritten digits show that ADD-networks are comparable to a state-of-the-art method and vastly outperform random selection of instances.
dc.language: eng
dc.publisher: NTNU
dc.subject: Datateknologi (Computer Science), Intelligente systemer (Intelligent Systems)
dc.title: A Committee of One - Using Dropout for Active Learning in Deep Networks
dc.type: Master thesis
dc.source.pagenumber: 78
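
The committee-by-dropout idea described in the abstract can be illustrated with a short sketch. The code below is not the thesis implementation; PyTorch, the network architecture, the number of Monte Carlo samples, vote entropy as the disagreement measure, and names such as `DropoutMLP` and `mc_dropout_disagreement` are all assumptions made for illustration. It shows the general mechanism: run several forward passes with dropout left active, treat each pass as a committee member, and query the unlabeled instances the committee disagrees on most.

```python
# Illustrative sketch only: uncertainty on an unlabeled instance is estimated
# by the disagreement among several stochastic forward passes with dropout
# kept active ("committee members"), and the most-disputed instances are
# selected for labeling. All names and hyperparameters here are assumptions.
import torch
import torch.nn as nn


class DropoutMLP(nn.Module):
    """A small classifier with dropout, e.g. for 28x28 handwritten digits."""

    def __init__(self, n_in=784, n_hidden=256, n_classes=10, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(n_hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)


def mc_dropout_disagreement(model, x_unlabeled, n_samples=20, n_classes=10):
    """Vote entropy over Monte-Carlo-sampled committee members.

    Each forward pass with dropout active corresponds to one committee member
    (a thinned sub-network sampled from the full network). Returns one
    disagreement score per unlabeled instance.
    """
    model.train()  # keep dropout active at prediction time
    votes = []
    with torch.no_grad():
        for _ in range(n_samples):
            logits = model(x_unlabeled)
            votes.append(logits.argmax(dim=1))  # each member votes for a class
    votes = torch.stack(votes, dim=0)           # shape: (n_samples, n_instances)
    # Fraction of committee votes per class, per instance.
    counts = torch.stack(
        [(votes == c).float().mean(dim=0) for c in range(n_classes)], dim=1
    )
    # Vote entropy: 0 when the committee agrees, large when it is split.
    return -(counts * torch.log(counts.clamp_min(1e-12))).sum(dim=1)


# Usage: score a pool of unlabeled digits and query the most disputed ones.
model = DropoutMLP()
pool = torch.rand(1000, 784)            # stand-in for unlabeled digit images
scores = mc_dropout_disagreement(model, pool)
query_idx = scores.topk(k=32).indices   # instances to send for labeling
```

Leaving dropout active at prediction time is what turns a single trained network into a cheap committee: each pass samples a different thinned sub-network, so disagreement across passes serves as a proxy for the full network's uncertainty.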

