Online Learning in Vowpal Wabbit
Master's thesis
Permanent link: http://hdl.handle.net/11250/2383183
Date of issue: 2015
Abstract
Online learning methods for sequentially arriving data are growing in popularity, as the alternative batch learning methods scale poorly and face memory constraints. The scope of this thesis is to study online learning methods that are based on stochastic gradient descent (SGD) and are implemented in Vowpal Wabbit, an increasingly popular piece of online learning software.

The literature on and experiments with these methods reveal that, despite scaling well, they are designed only for data originating from stationary models. This is an important weakness, as the data for which these methods are needed will often be nonstationary in nature.

We propose a new framework that builds on the SGD algorithm. For every incoming example, Parallelised SGD (PSGD) runs alternative SGD learners with different learning rates in parallel with a chosen SGD learner. The alternative learners help tune the chosen learning rate through a sequential comparison of the learners' errors. The framework remains scalable because the gradient still only needs to be computed once per example, and the added computational cost of the alternative learners can be reduced through efficient parallelisation.

Experiments on a proof-of-concept implementation demonstrate that PSGD outperforms Vowpal Wabbit's SGD-based implementations in nonstationary settings. However, further work is needed to make the method adaptive to a wider range of nonstationary behaviour. Sustained research on this framework shows great promise to yield a class of adaptive learners that handle nonstationary data automatically and can be implemented at large scale in online learning software such as Vowpal Wabbit.
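To make the idea concrete, the following is a minimal sketch of a PSGD-style update loop, not the thesis's exact implementation. It assumes a linear model with squared loss, that the alternative learners step from the chosen learner's current weights so a single gradient per example can be shared, and that the candidate learning rate with the lowest discounted running error is adopted; the rates, the discount factor, and the switching rule are illustrative assumptions.

```python
import numpy as np

def psgd_sketch(stream, dim, rates=(0.005, 0.05, 0.5), discount=0.99):
    """Rough PSGD-style sketch: tune the SGD learning rate by comparing
    parallel candidate step sizes on a shared per-example gradient."""
    w = np.zeros(dim)                      # weights of the chosen learner
    run_err = np.zeros(len(rates))         # discounted error per candidate rate
    chosen = 0                             # index of the currently chosen rate
    for x, y in stream:
        pred = w @ x
        grad = (pred - y) * x              # squared-loss gradient, computed once
        # error each candidate step size would incur on this example
        for i, eta in enumerate(rates):
            cand_pred = (w - eta * grad) @ x
            run_err[i] = discount * run_err[i] + (cand_pred - y) ** 2
        chosen = int(np.argmin(run_err))   # adopt the best-performing rate
        w -= rates[chosen] * grad          # update the chosen learner with it
    return w, rates[chosen]

# Example usage on a synthetic nonstationary stream (hypothetical data):
rng = np.random.default_rng(0)
true_w = np.ones(5)
def stream(n=2000):
    for t in range(n):
        x = rng.normal(size=5)
        if t == n // 2:
            true_w[:] = -true_w          # abrupt change in the target model
        yield x, true_w @ x + 0.1 * rng.normal()

w_hat, eta_hat = psgd_sketch(stream(), dim=5)
```

In this sketch the per-example work beyond the single gradient is one inner product per candidate rate, which is what makes running the alternative learners cheap and easy to parallelise.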