Mathematics, 2012

A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets


Abstract:

We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. In a machine learning context, numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms, both in terms of optimizing the training error and reducing the test error quickly.
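
The "memory of previous gradient values" idea can be illustrated with a short sketch. This is a minimal illustration under our own naming, not the authors' reference implementation: the function sag, the gradient callback grad_i, and all parameter names below are hypothetical. Each iteration evaluates one fresh component gradient, as in standard stochastic gradient descent, but steps along the average of the most recently stored gradients of all n components.

import numpy as np

def sag(grad_i, x0, n, step_size, num_iters, seed=0):
    # Sketch of a stochastic average gradient step: minimize
    # g(x) = (1/n) * sum_i f_i(x) by keeping a memory y[i] of the
    # most recent gradient seen for each f_i and stepping along the
    # average of the stored gradients.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    y = np.zeros((n, x.size))      # gradient memory, one slot per f_i
    d = np.zeros_like(x)           # running sum of the stored gradients

    for _ in range(num_iters):
        i = rng.integers(n)        # sample one component uniformly
        g = grad_i(i, x)           # one fresh gradient evaluation per step
        d += g - y[i]              # swap the stale entry out of the sum
        y[i] = g
        x -= (step_size / n) * d   # step along the averaged direction
    return x

# Example (hypothetical data): l2-regularized least squares, where
# f_i(x) = 0.5 * (A[i] @ x - b[i])**2 + 0.5 * lam * x @ x.
rng = np.random.default_rng(1)
A, b, lam = rng.normal(size=(100, 5)), rng.normal(size=100), 0.1
grad = lambda i, x: (A[i] @ x - b[i]) * A[i] + lam * x
L_max = np.max(np.sum(A**2, axis=1)) + lam   # bound on the component smoothness
x_hat = sag(grad, np.zeros(5), n=100,
            step_size=1.0 / (16 * L_max),    # conservative, of order 1/L_max
            num_iters=20000)

The design point behind the memory is that the direction d / n approaches the full gradient as the stored entries are refreshed, which is what permits a constant step size and the linear (geometric) convergence rate claimed in the abstract, at the cost of O(n * p) storage for the gradient table.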
