%0 Journal Article
%T Estimated VC dimension for risk bounds
%A Daniel J. McDonald
%A Cosma Rohilla Shalizi
%A Mark Schervish
%J Statistics
%D 2011
%I arXiv
%X Vapnik-Chervonenkis (VC) dimension is a fundamental measure of the generalization capacity of learning algorithms. However, apart from a few special cases, it is hard or impossible to calculate analytically. Vapnik et al. [10] proposed a technique for estimating the VC dimension empirically. While their approach behaves well in simulations, it could not be used to bound the generalization risk of classifiers, because there were no bounds on the estimation error of the VC dimension itself. We rectify this omission, providing high-probability concentration results for the proposed estimator and deriving corresponding generalization bounds.
%U http://arxiv.org/abs/1111.3404v1