- Title
Additive regularization of topic models.
- Authors
Potapenko, Anna; Vorontsov, Konstantin
- Abstract
Probabilistic topic modeling of text collections has recently been developed mainly within the framework of graphical models and Bayesian inference. In this paper we introduce an alternative semi-probabilistic approach, which we call additive regularization of topic models (ARTM). Instead of building a purely probabilistic generative model of text, we regularize an ill-posed problem of stochastic matrix factorization by maximizing a weighted sum of the log-likelihood and additional criteria. This approach enables us to combine probabilistic assumptions with linguistic and problem-specific requirements in a single multi-objective topic model. In the theoretical part of the work we derive the regularized EM-algorithm and provide a pool of regularizers, which can be applied together in any combination. We show that many models previously developed within the Bayesian framework can be inferred more easily within ARTM and in some cases generalized. In the experimental part we show that a combination of sparsing, smoothing, and decorrelation improves several quality measures at once with almost no loss of likelihood.
- Subjects
PROBABILISTIC generative models; EXPECTATION-maximization algorithms; TEXT mining; MATHEMATICAL regularization; BAYESIAN analysis; LATENT semantic analysis; DATA modeling
- Publication
Machine Learning, 2015, Vol. 101, Issue 1–3, p. 303
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-014-5476-6
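
The abstract's core idea — maximizing the log-likelihood of a stochastic matrix factorization plus weighted regularizers via a regularized EM-algorithm — can be sketched on toy data. This is a minimal illustration, not the authors' implementation: the term-document counts are random, the regularizer weight `tau` and prior level `beta0` are assumed values, and only one regularizer (a KL-style sparsing term, subtracted from the expected counts in the M-step) from the paper's pool is shown.

```python
import numpy as np

# Toy sketch of ARTM-style regularized EM on a random term-document matrix.
# All names (n_dw, phi, theta, tau, beta0) are illustrative assumptions.

rng = np.random.default_rng(0)
n_dw = rng.integers(0, 5, size=(20, 50)).astype(float)  # docs x words counts
D, W = n_dw.shape
T = 5            # number of topics
tau = 0.1        # regularizer weight (assumed)
beta0 = 0.01     # uniform level subtracted to induce sparsity (assumed)

phi = rng.random((W, T)); phi /= phi.sum(axis=0)        # p(w|t), columns sum to 1
theta = rng.random((T, D)); theta /= theta.sum(axis=0)  # p(t|d), columns sum to 1

for _ in range(30):
    # E-step: posterior p(t|d,w) proportional to phi_wt * theta_td
    p = phi[:, :, None] * theta[None, :, :]             # shape W x T x D
    p /= p.sum(axis=1, keepdims=True) + 1e-12
    # Expected counts n_wt and n_td, weighted by the observed counts
    n_wt = (n_dw.T[:, None, :] * p).sum(axis=2)         # W x T
    n_td = (n_dw.T[:, None, :] * p).sum(axis=0)         # T x D
    # M-step with an additive sparsing regularizer:
    # phi_wt proportional to max(n_wt - tau * beta0, 0), then renormalized
    phi = np.maximum(n_wt - tau * beta0, 0)
    phi /= phi.sum(axis=0, keepdims=True) + 1e-12
    theta = np.maximum(n_td - tau * beta0, 0)
    theta /= theta.sum(axis=0, keepdims=True) + 1e-12

print(phi.shape, theta.shape)  # (50, 5) (5, 20)
```

Setting `tau = 0` recovers plain PLSA-style EM; other regularizers from the paper's pool (smoothing, decorrelation) would add their own terms to the same M-step update, which is what makes the regularization additive.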