%0 Journal Article
%T Fast Distributed Coordinate Descent for Non-Strongly Convex Losses
%A Olivier Fercoq
%A Zheng Qu
%A Peter Richtárik
%A Martin Takáč
%J Mathematics
%D 2014
%I arXiv
%X We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal $O(1/k^2)$ convergence rate, where $k$ is the iteration counter. The core of the work is the theoretical study of stepsize parameters. We have implemented the method on Archer, the largest supercomputer in the UK, and show that the method is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
%U http://arxiv.org/abs/1405.5300v2
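The abstract refers to solving a LASSO problem, i.e. minimizing $\tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$, by randomized coordinate descent. The sketch below is only a minimal single-node illustration of that problem class with plain (non-accelerated) coordinate updates and soft-thresholding; it is not the distributed, accelerated method described in the paper, and all names and parameters in it (lasso_rcd, lam, iters) are hypothetical.

```python
# Minimal sketch: randomized coordinate descent for
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# Illustrative only; not the distributed accelerated algorithm of the cited paper.
import numpy as np

def lasso_rcd(A, b, lam, iters=10000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    residual = A @ x - b                      # A x - b, maintained incrementally
    col_norms = (A ** 2).sum(axis=0)          # per-coordinate Lipschitz constants ||A_j||^2
    for _ in range(iters):
        j = rng.integers(n)                   # pick one coordinate uniformly at random
        if col_norms[j] == 0.0:
            continue
        grad_j = A[:, j] @ residual           # partial derivative of the smooth part
        z = x[j] - grad_j / col_norms[j]      # gradient step on coordinate j
        x_new = np.sign(z) * max(abs(z) - lam / col_norms[j], 0.0)  # soft-threshold (prox of l1)
        residual += (x_new - x[j]) * A[:, j]  # keep residual consistent with the new x_j
        x[j] = x_new
    return x

if __name__ == "__main__":
    # Tiny synthetic usage example
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 500))
    x_true = np.zeros(500); x_true[:10] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = lasso_rcd(A, b, lam=0.1)
    print("nonzero coefficients recovered:", int((np.abs(x_hat) > 1e-3).sum()))
```

Each iteration touches a single column of A, which is what makes coordinate descent attractive for very high-dimensional problems; the paper's contribution lies in the distributed, accelerated variant and its stepsize analysis, none of which is reproduced here.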