- Title
Model selection by bootstrap penalization for classification.
- Authors
Magalie Fromont
- Abstract
We consider the binary classification problem. Given an i.i.d. sample drawn from the distribution of an X × {0,1}-valued random pair, we propose to estimate the so-called Bayes classifier by minimizing the sum of the empirical classification error and a penalty term based on Efron's or i.i.d. weighted bootstrap samples of the data. We obtain exponential inequalities for such bootstrap-type penalties, which allow us to derive non-asymptotic properties for the corresponding estimators. In particular, we prove that these estimators achieve the global minimax risk over sets of functions built from Vapnik-Chervonenkis classes. The results obtained generalize those of Koltchinskii (2001) and Bartlett et al. (2002) for Rademacher penalties, which can thus be seen as special examples of bootstrap-type penalties. To illustrate this, we carry out an experimental study in which we compare the different methods for an intervals model selection problem.
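The penalized criterion described in the abstract can be sketched numerically. The following is a minimal illustration only, not the paper's exact construction: the Monte Carlo estimation of the penalty, the particular weight schemes, and the function names (`bootstrap_penalty`, `penalized_criterion`) are assumptions made for the sake of the example. Efron's bootstrap corresponds to multinomial resampling weights, Rademacher penalties to centered ±1 weights, and i.i.d. weighted bootstrap is shown here with exponential weights of mean 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_error(pred, y):
    """Empirical classification error of one 0/1 prediction vector."""
    return np.mean(pred != y)

def bootstrap_penalty(preds, y, scheme="efron", n_rounds=200, rng=rng):
    """Monte Carlo sketch of a bootstrap-type penalty:
    E_W [ sup_f (1/n) * sum_i (W_i - Wbar) * 1{f(X_i) != Y_i} ],
    where preds is an (m, n) array of 0/1 predictions, one row per
    classifier in the model. This schematic form is an assumption,
    not the paper's exact penalty.
    """
    n = y.shape[0]
    losses = (preds != y).astype(float)   # (m, n) 0/1 loss matrix
    sups = []
    for _ in range(n_rounds):
        if scheme == "efron":
            # Efron's bootstrap: multinomial resampling counts
            w = rng.multinomial(n, np.ones(n) / n).astype(float)
        elif scheme == "rademacher":
            # Rademacher penalty as a special case: weights 1 + sigma_i
            w = 1.0 + rng.choice([-1.0, 1.0], size=n)
        else:
            # i.i.d. weighted bootstrap, e.g. exponential weights, mean 1
            w = rng.exponential(1.0, size=n)
        centered = w - w.mean()
        sups.append(np.max(losses @ centered) / n)
    return float(np.mean(sups))

def penalized_criterion(preds, y, **kw):
    """Minimum empirical error over the model plus the bootstrap penalty."""
    errs = np.mean(preds != y, axis=1)
    return float(errs.min()) + bootstrap_penalty(preds, y, **kw)
```

A model here is simply a finite family of classifiers stacked as rows of `preds` (e.g. indicators of intervals, as in the experimental study); model selection would pick the model minimizing `penalized_criterion`.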
- Subjects
DISTRIBUTION (Probability theory); BAYESIAN analysis; STATISTICAL bootstrapping; EXPONENTIAL families (Statistics)
- Publication
Machine Learning, 2007, Vol 66, Issue 2-3, p165
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-006-7679-y