- Title
Learning by extrapolation from marginal to full-multivariate probability distributions: decreasingly naive Bayesian classification.
- Authors
Webb, Geoffrey; Boughton, Janice; Zheng, Fei; Ting, Kai; Salem, Houssam
- Abstract
Averaged n-Dependence Estimators (AnDE) is an approach to probabilistic classification learning that learns by extrapolation from marginal to full-multivariate probability distributions. It utilizes a single parameter that transforms the approach between a low-variance high-bias learner (Naive Bayes) and a high-variance low-bias learner with Bayes optimal asymptotic error. It extends the underlying strategy of Averaged One-Dependence Estimators (AODE), which relaxes the Naive Bayes independence assumption while retaining many of Naive Bayes' desirable computational and theoretical properties. AnDE further relaxes the independence assumption by generalizing AODE to higher levels of dependence. Extensive experimental evaluation shows that the bias-variance trade-off for Averaged 2-Dependence Estimators results in strong predictive accuracy over a wide range of data sets. It has training time linear in the number of examples, learns in a single pass through the training data, supports incremental learning, directly handles missing values, and is robust in the face of noise. Beyond the practical utility of its lower-dimensional variants, AnDE is of interest in that it demonstrates that it is possible to create low-bias high-variance generative learners and suggests strategies for developing even more powerful classifiers.
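The abstract's strategy can be illustrated with the n=1 case (AODE): each feature takes a turn as a "super-parent" on which all other features depend, and the class posterior is an average over the resulting one-dependence estimators. The sketch below is an illustrative reconstruction for discrete features, not the authors' implementation; the class name, Laplace smoothing, and data layout are assumptions.

```python
import numpy as np
from collections import defaultdict

class A1DE:
    """Minimal sketch of Averaged 1-Dependence Estimators (AODE), the n=1
    member of the AnDE family, for discrete-valued features.

    Training is a single counting pass over the data, so it is linear in
    the number of examples and naturally supports incremental updates.
    """

    def __init__(self, smoothing=1.0):
        self.smoothing = smoothing  # Laplace smoothing (an illustrative choice)

    def fit(self, X, y):
        X = np.asarray(X)
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        self.n_features_ = X.shape[1]
        self.feature_values_ = [np.unique(X[:, j]) for j in range(self.n_features_)]
        self.n_ = len(y)
        # One pass collects every count the averaged estimators need:
        # count(y, x_p) for each super-parent p, and count(y, x_p, x_j).
        self.count_y_xp_ = defaultdict(int)
        self.count_y_xp_xj_ = defaultdict(int)
        for xi, yi in zip(X, y):
            for p in range(self.n_features_):
                self.count_y_xp_[(yi, p, xi[p])] += 1
                for j in range(self.n_features_):
                    if j != p:
                        self.count_y_xp_xj_[(yi, p, xi[p], j, xi[j])] += 1
        return self

    def predict(self, X):
        return np.array([self._predict_one(xi) for xi in np.asarray(X)])

    def _predict_one(self, x):
        s = self.smoothing
        scores = []
        for c in self.classes_:
            total = 0.0
            for p in range(self.n_features_):
                # Super-parent joint estimate P(y, x_p).
                n_yp = self.count_y_xp_[(c, p, x[p])]
                prob = (n_yp + s) / (
                    self.n_ + s * len(self.classes_) * len(self.feature_values_[p])
                )
                # Children factor as P(x_j | y, x_p) under one-dependence.
                for j in range(self.n_features_):
                    if j == p:
                        continue
                    n_ypj = self.count_y_xp_xj_[(c, p, x[p], j, x[j])]
                    prob *= (n_ypj + s) / (n_yp + s * len(self.feature_values_[j]))
                total += prob  # sum over super-parents = the averaging step
            scores.append(total)
        return self.classes_[int(np.argmax(scores))]

# Usage: XOR-labelled data, which Naive Bayes cannot represent but a
# one-dependence estimator can, since the super-parent captures the
# needed pairwise interaction.
X = [[0, 0], [0, 1], [1, 0], [1, 1]] * 5
y = [0, 1, 1, 0] * 5
clf = A1DE().fit(X, y)
preds = clf.predict([[0, 0], [0, 1], [1, 0], [1, 1]])
```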
- Subjects
APPROXIMATION theory; NUMERICAL analysis; MULTIVARIATE analysis; DISTRIBUTION (Probability theory); BAYESIAN analysis
- Publication
Machine Learning, 2012, Vol 86, Issue 2, p233
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-011-5263-6