
Statistics  2009 

Fast learning rates in statistical inference through aggregation

DOI: 10.1214/08-AOS623



We develop minimax optimal risk bounds for the general learning task of predicting as well as the best function in a reference set $\mathcal{G}$, up to the smallest possible additive term, called the convergence rate. When the reference set is finite and $n$ denotes the size of the training data, we provide minimax convergence rates of the form $C(\frac{\log|\mathcal{G}|}{n})^v$ with a tight evaluation of the positive constant $C$ and with exact exponent $0 < v \le 1$.
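To illustrate how the stated rate $C(\frac{\log|\mathcal{G}|}{n})^v$ behaves, here is a minimal sketch that evaluates the bound for hypothetical values of $C$, $|\mathcal{G}|$, $n$ and $v$; the constants chosen below are illustrative assumptions, not values from the paper.

```python
import math

def aggregation_rate(C, G_size, n, v):
    """Evaluate the convergence-rate bound C * (log|G| / n)^v
    for a finite reference set of G_size functions and n samples."""
    return C * (math.log(G_size) / n) ** v

# Hypothetical setting: |G| = 100 reference functions, n = 1000 samples, C = 1.
slow = aggregation_rate(1.0, 100, 1000, 0.5)  # exponent v = 1/2 (slower regime)
fast = aggregation_rate(1.0, 100, 1000, 1.0)  # exponent v = 1 (fast regime)
```

Note how a larger exponent $v$ yields a smaller additive term once $\log|\mathcal{G}|/n < 1$, which is why $v = 1$ is referred to as a fast learning rate.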

