%0 Journal Article
%T Parallel Online Learning
%A Daniel Hsu
%A Nikos Karampatziakis
%A John Langford
%A Alex Smola
%J Computer Science
%D 2011
%I arXiv
%X In this work we study the parallelization of online learning, a core primitive in machine learning. In a parallel environment, all known approaches to parallel online learning lead to delayed updates, where the model is updated using out-of-date information. In the worst case, or when examples are temporally correlated, delay can have a very adverse effect on the learning algorithm. Here, we analyze and present preliminary empirical results for a set of learning architectures based on a feature-sharding approach that present various tradeoffs between delay, degree of parallelism, representation power, and empirical performance.
%U http://arxiv.org/abs/1103.4204v1