- Title
A two‐step proximal‐point algorithm for the calculus of divergence‐based estimators in finite mixture models.
- Authors
Mohamad, Diaa Al; Broniatowski, Michel
- Abstract
Estimators derived from the expectation‐maximization (EM) algorithm are not robust, since they are based on maximization of the likelihood function. We propose an iterative proximal‐point algorithm, based on the EM algorithm, that minimizes a divergence criterion between a mixture model and the unknown distribution that generates the data. At each iteration, the algorithm estimates the proportions and the parameters of the mixture components in two separate steps. The resulting estimators are generally robust against outliers and misspecification of the model. Convergence properties of our algorithm are studied. Convergence of the proposed algorithm is discussed on a two‐component Weibull mixture, which yields a condition on the initialization of the EM algorithm that is required for the latter to converge. Simulations on Gaussian and Weibull mixture models using different statistical divergences confirm the validity of our work and the robustness of the resulting estimators against outliers in comparison with the EM algorithm. An application to a dataset of velocities of galaxies is also presented. The Canadian Journal of Statistics 47: 392–408; 2019 © 2019 Statistical Society of Canada
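To make the two-step structure described in the abstract concrete, here is a minimal Python sketch for a two-component Gaussian mixture. Each iteration first updates the mixture proportion alone, then updates the component means and scales with the proportion held fixed. This mirrors only the two-step *structure* of the scheme; the updates below are plain EM/likelihood updates, not the divergence-based proximal-point steps of the article, and all function names (`norm_pdf`, `two_step_fit`) are illustrative assumptions.

```python
import numpy as np

def norm_pdf(x, mu, sd):
    # Gaussian density, used for the component likelihoods
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def two_step_fit(x, n_iter=300, tol=1e-10):
    """Fit a two-component Gaussian mixture, updating the proportion
    and the component parameters in two separate steps per iteration
    (an EM-style stand-in for the paper's proximal-point scheme)."""
    w = 0.5
    mu = np.array([x.min(), x.max()], dtype=float)   # crude initialization
    sd = np.array([x.std(), x.std()], dtype=float)
    for _ in range(n_iter):
        # responsibilities of component 1 under the current fit
        p1 = w * norm_pdf(x, mu[0], sd[0])
        p2 = (1.0 - w) * norm_pdf(x, mu[1], sd[1])
        r = p1 / (p1 + p2)
        # step 1: update the mixture proportion only
        w = r.mean()
        # step 2: update means and scales with the proportion held fixed
        mu_old = mu.copy()
        mu = np.array([np.sum(r * x) / r.sum(),
                       np.sum((1.0 - r) * x) / (1.0 - r).sum()])
        sd = np.array([np.sqrt(np.sum(r * (x - mu[0]) ** 2) / r.sum()),
                       np.sqrt(np.sum((1.0 - r) * (x - mu[1]) ** 2) / (1.0 - r).sum())])
        if np.max(np.abs(mu - mu_old)) < tol:
            break
    return w, mu, sd

# synthetic two-component data: 40% around 0, 60% around 5
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 800), rng.normal(5.0, 1.0, 1200)])
w, mu, sd = two_step_fit(x)
```

On well-separated data like this, `w` lands near 0.4 and `mu` near (0, 5); the robustness against outliers claimed in the abstract comes from the divergence criterion, which this likelihood-based sketch does not reproduce.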
- Subjects
CANADA; EXPECTATION-maximization algorithms; GAUSSIAN mixture models; CALCULUS; INTEGRAL field spectroscopy; DIVERGENCE theorem; ALGORITHMS; MIXTURES
- Publication
Canadian Journal of Statistics, 2019, Vol 47, Issue 3, p392
- ISSN
0319-5724
- Publication type
Article
- DOI
10.1002/cjs.11500