- Title
A weighted voting framework for classifiers ensembles.
- Authors
Kuncheva, Ludmila; Rodríguez, Juan
- Abstract
We propose a probabilistic framework for classifier combination, which gives rigorous optimality conditions (minimum classification error) for four combination methods: majority vote, weighted majority vote, the recall combiner and the naive Bayes combiner. The framework rests on two assumptions: class-conditional independence of the classifier outputs and an assumption about the individual accuracies. The four combiners are derived successively from one another by progressively relaxing and then eliminating the second assumption; in parallel, the number of trainable parameters increases from one combiner to the next. Simulation studies reveal that if the parameter estimates are accurate and the first assumption is satisfied, the order of preference of the combiners is: naive Bayes, recall, weighted majority and majority. By inducing label noise, we expose a caveat stemming from the stability-plasticity dilemma. Experimental results with 73 benchmark data sets reveal that there is no definitive best combiner among the four candidates, though naive Bayes is slightly preferred: it was better for problems with a large number of fairly balanced classes, while weighted majority vote was better for problems with a small number of unbalanced classes.
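To make the abstract's combination rules concrete, the sketch below implements textbook versions of three of the four fusion rules (majority vote, weighted majority vote with the classical log-odds weights, and the naive Bayes combiner built from per-classifier confusion matrices). These are standard formulations assumed for illustration, not the paper's own derivation; all function names and example numbers are hypothetical.

```python
import numpy as np

def majority_vote(labels, n_classes):
    """Plain majority vote over the classifiers' label outputs."""
    counts = np.bincount(labels, minlength=n_classes)
    return int(np.argmax(counts))

def weighted_majority_vote(labels, accuracies, n_classes):
    """Weighted majority vote: classifier i votes with weight
    log(p_i / (1 - p_i)), the classical optimal weight under
    class-conditional independence, where p_i is its accuracy."""
    acc = np.asarray(accuracies, dtype=float)
    weights = np.log(acc / (1.0 - acc))
    scores = np.zeros(n_classes)
    for lbl, w in zip(labels, weights):
        scores[lbl] += w
    return int(np.argmax(scores))

def naive_bayes_combiner(labels, confusion_matrices, priors):
    """Naive Bayes combiner: combine class priors with the
    confusion-matrix likelihoods P(output | true class), assuming
    class-conditional independence of the classifier outputs."""
    scores = np.log(np.asarray(priors, dtype=float))
    for lbl, cm in zip(labels, confusion_matrices):
        cm = np.asarray(cm, dtype=float)
        # Column `lbl` of the row-normalised confusion matrix gives
        # P(classifier outputs lbl | true class = k) for each k.
        likelihood = cm[:, lbl] / cm.sum(axis=1)
        scores += np.log(likelihood + 1e-12)  # guard against log(0)
    return int(np.argmax(scores))

# Hypothetical example: three classifiers vote on a two-class problem.
labels = np.array([0, 1, 1])
print(majority_vote(labels, 2))                            # → 1
# A highly accurate dissenter can outvote two weak classifiers:
print(weighted_majority_vote(labels, [0.9, 0.6, 0.6], 2))  # → 0
```

The example illustrates the trade-off the abstract describes: weighted majority vote needs extra trainable parameters (the individual accuracies), and the naive Bayes combiner needs still more (a full confusion matrix per classifier), which is why its advantage hinges on accurate parameter estimates.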
- Subjects
ERROR analysis in mathematics; PROBABILISTIC databases; VOTING; SIMULATION methods & models; NAIVE Bayes classification; DATA analysis
- Publication
Knowledge & Information Systems, 2014, Vol. 38, Issue 2, p. 259
- ISSN
0219-1377
- Publication type
Article
- DOI
10.1007/s10115-012-0586-6