%0 Journal Article
%T A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets
%A Nicolas Le Roux
%A Mark Schmidt
%A Francis Bach
%J Mathematics
%D 2012
%I arXiv
%X We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. In a machine learning context, numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms, both in terms of optimizing the training error and reducing the test error quickly.
%U http://arxiv.org/abs/1202.6258v4
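The abstract describes the core idea of keeping a memory of the most recent gradient seen for each training example and stepping along the average of the stored gradients. Below is a minimal NumPy sketch of that idea, assuming a ridge-regularized least-squares objective; the function name `sag_least_squares`, the step-size constant, and the iteration budget are illustrative assumptions, not the paper's tuned settings.

```python
import numpy as np

def sag_least_squares(X, y, lam=0.1, step=None, n_iters=10000, seed=0):
    """Sketch of a stochastic-average-gradient-style update for
    f(w) = (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + (lam/2)*||w||^2.
    Keeps a memory of the last gradient computed for each example and
    steps along the average of all stored gradients."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    grad_table = np.zeros((n, d))   # memory of per-example gradients
    grad_sum = np.zeros(d)          # running sum of the stored gradients
    if step is None:
        # Crude bound on the per-example smoothness constant; the paper
        # ties the step size to such constants, but this value is
        # purely illustrative.
        step = 1.0 / (np.max(np.sum(X**2, axis=1)) + lam)
    for _ in range(n_iters):
        i = rng.integers(n)
        # Fresh gradient of the i-th term (regularizer included).
        g_new = (X[i] @ w - y[i]) * X[i] + lam * w
        grad_sum += g_new - grad_table[i]   # refresh one memory slot
        grad_table[i] = g_new
        w -= step * grad_sum / n            # step along the average
    return w
```

Because refreshing a single memory slot updates the running sum in O(d), each iteration costs about the same as a plain stochastic gradient step while the update direction aggregates gradient information from all n examples.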