- Title
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator.
- Authors
Nguyen, Viet Anh; Kuhn, Daniel; Mohajerin Esfahani, Peyman
- Abstract
In "Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator," Nguyen, Kuhn, and Mohajerin Esfahani propose a distributionally robust inverse covariance estimator, obtained by robustifying the Gaussian maximum likelihood problem with a Wasserstein ambiguity set. The optimal solutions of many decision problems, such as Markowitz portfolio allocation and linear discriminant analysis, depend on the inverse covariance matrix of a Gaussian random vector.
We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian random vector from n independent samples. The proposed model minimizes the worst case (maximum) of Stein's loss across all normal reference distributions within a prescribed Wasserstein distance from the normal distribution characterized by the sample mean and the sample covariance matrix. We prove that this estimation problem is equivalent to a semidefinite program that is tractable in theory but beyond the reach of general-purpose solvers for practically relevant problem dimensions p. In the absence of any prior structural information, the estimation problem has an analytical solution that is naturally interpreted as a nonlinear shrinkage estimator. Besides being invertible and well conditioned even for p > n, the new shrinkage estimator is rotation equivariant and preserves the order of the eigenvalues of the sample covariance matrix. These desirable properties are not imposed ad hoc but emerge naturally from the underlying distributionally robust optimization model. Finally, we develop a sequential quadratic approximation algorithm for efficiently solving the general estimation problem subject to conditional independence constraints typically encountered in Gaussian graphical models.
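The structure the abstract describes (eigendecompose the sample covariance, apply a nonlinear map to the eigenvalues, and rebuild a precision matrix in the same eigenbasis, so the estimator is rotation equivariant and invertible even when p > n) can be sketched as follows. This is an illustrative stand-in only: the eigenvalue map used here, a simple regularized inversion, is a hypothetical placeholder, not the paper's closed-form Wasserstein shrinkage map, and the parameter `rho` merely stands in for the role of the ambiguity radius.

```python
import numpy as np

def rotation_equivariant_precision(S, rho=0.1):
    """Illustrative rotation-equivariant shrinkage precision estimator.

    Eigendecompose the sample covariance S, apply a nonlinear map to its
    eigenvalues, and reassemble the precision matrix in the same
    eigenbasis. The map below (regularized inversion) is a stand-in for
    the paper's actual nonlinear shrinkage formula.
    """
    lam, V = np.linalg.eigh(S)      # eigenvalues in ascending order
    x = 1.0 / (lam + rho)           # strictly positive even if lam == 0,
                                    # and monotone: eigenvalue order of S
                                    # is preserved (reversed by inversion)
    return V @ np.diag(x) @ V.T

# p > n example: the sample covariance is singular, yet the
# shrinkage estimate remains symmetric and positive definite.
rng = np.random.default_rng(0)
p, n = 20, 10
A = rng.standard_normal((n, p))
Ac = A - A.mean(axis=0)
S = Ac.T @ Ac / n
X = rotation_equivariant_precision(S, rho=0.1)
```

Because the map acts only on eigenvalues while keeping the sample eigenvectors, the estimator inherits rotation equivariance by construction, mirroring the property the paper derives from its distributionally robust model rather than imposing it ad hoc.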
- Subjects
FISHER discriminant analysis; MATRIX inversion; RANDOM matrices; COVARIANCE matrices; MAXIMUM likelihood statistics
- Publication
Operations Research, 2022, Vol. 70, Issue 1, p. 490
- ISSN
0030-364X
- Publication type
Article
- DOI
10.1287/opre.2020.2076