%0 Journal Article
%T Fast learning rates in statistical inference through aggregation
%A Audibert, Jean-Yves
%J The Annals of Statistics
%D 2009
%I arXiv
%R 10.1214/08-AOS623
%X We develop minimax optimal risk bounds for the general learning task consisting of predicting as well as the best function in a reference set $\mathcal{G}$, up to the smallest possible additive term, called the convergence rate. When the reference set is finite and when $n$ denotes the size of the training data, we provide minimax convergence rates of the form $C(\frac{\log|\mathcal{G}|}{n})^v$ with tight evaluation of the positive constant $C$ and with exact $0<v\le 1$.