SGD-QN: Careful quasi-Newton stochastic gradient descent

A Bordes, L Bottou, P Gallinari - Journal of Machine Learning Research, 2009 - jmlr.org
Abstract
The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy. This algorithm won the "Wild Track" of the first PASCAL Large Scale Learning Challenge (Sonnenburg et al., 2008).
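The abstract mentions two ideas: a first-order stochastic step rescaled with (diagonal) second-order information, and a separate, slower schedule for refreshing that rescaling. The sketch below is a rough illustration of that structure only, not the paper's exact algorithm; the hinge-loss objective, the `skip` interval, and the initial gain `t0` are assumptions made for the sake of a runnable example.

```python
import numpy as np

def sgdqn_style_sketch(X, y, lam=1e-4, epochs=5, skip=16, t0=1e4):
    """Illustrative SGD with per-coordinate gains refreshed on a slower schedule.

    NOT the authors' SGD-QN: it only mimics (a) a cheap rescaled first-order
    step every iteration and (b) an independently scheduled update of the
    diagonal gains every `skip` iterations.
    """
    n, d = X.shape
    w = np.zeros(d)
    B = np.full(d, 1.0 / (lam * t0))      # diagonal gains (crude curvature proxy)
    count = skip
    for _ in range(epochs):
        for i in np.random.permutation(n):
            margin = y[i] * X[i].dot(w)
            # stochastic gradient of hinge loss plus L2 regularizer
            g = lam * w - (y[i] * X[i] if margin < 1 else 0.0)
            w -= B * g                     # rescaled first-order step (fast path)
            count -= 1
            if count <= 0:                 # slower, independent schedule
                # decay gains roughly like 1/(lam*(t0 + t)) per coordinate
                B = 1.0 / (1.0 / B + skip * lam)
                count = skip
    return w
```

In the actual paper the diagonal rescaling is estimated from secant-style differences of gradients rather than the fixed decay used above; the point of the sketch is only that the expensive bookkeeping runs every `skip` iterations while the per-example update stays as cheap as plain SGD.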