- Title
Differentially private Riemannian optimization.
- Authors
Han, Andi; Mishra, Bamdev; Jawanpuria, Pratik; Gao, Junbin
- Abstract
In this paper, we study the differentially private empirical risk minimization problem where the parameter is constrained to a Riemannian manifold. We introduce a framework for performing differentially private Riemannian optimization by adding noise to the Riemannian gradient on the tangent space. The noise follows a Gaussian distribution intrinsically defined with respect to the Riemannian metric on the tangent space. We adapt the Gaussian mechanism from the Euclidean space to the tangent space, compatible with such a generalized Gaussian distribution. This approach presents a novel analysis as compared to directly adding noise on the manifold. We further prove privacy guarantees of the proposed differentially private Riemannian (stochastic) gradient descent using an extension of the moments accountant technique. Overall, we provide utility guarantees under geodesic (strongly) convex and general nonconvex objectives, as well as under the Riemannian Polyak-Łojasiewicz condition. Empirical results illustrate the versatility and efficacy of the proposed framework in several applications.
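The core idea described in the abstract — computing the Riemannian gradient, perturbing it with Gaussian noise defined on the tangent space, and taking a gradient step on the manifold — can be sketched for the simplest case of the unit sphere. This is a minimal illustrative sketch, not the paper's implementation: the function name `dp_riemannian_sgd_step`, the choice of manifold, the noise scale `sigma`, and the use of metric projection as the retraction are all assumptions made here for concreteness.

```python
import numpy as np

def dp_riemannian_sgd_step(x, euclid_grad, lr, sigma, rng):
    """One differentially private Riemannian gradient step on the unit
    sphere S^{d-1} (illustrative sketch; not the authors' implementation)."""
    # Riemannian gradient for the sphere with the induced metric:
    # project the Euclidean gradient onto the tangent space at x,
    # i.e. the subspace orthogonal to x.
    rgrad = euclid_grad - np.dot(euclid_grad, x) * x
    # Tangent-space Gaussian mechanism (sketch): sample ambient Gaussian
    # noise and project it onto the tangent space, so the perturbation
    # lives in the same space as the Riemannian gradient.
    noise = rng.normal(0.0, sigma, size=x.shape)
    noise = noise - np.dot(noise, x) * x
    # Move along the perturbed negative gradient, then retract back onto
    # the manifold via metric projection (normalization for the sphere).
    y = x - lr * (rgrad + noise)
    return y / np.linalg.norm(y)
```

A usage check: starting from a point on the sphere, the updated iterate remains on the sphere (unit norm) regardless of the noise realization, which is the point of perturbing in the tangent space before retracting rather than adding noise to the manifold point directly.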
- Subjects
METRIC spaces; RIEMANNIAN metric; GAUSSIAN distribution; GENERALIZED spaces; RANDOM noise theory; RIEMANNIAN manifolds
- Publication
Machine Learning, 2024, Vol. 113, Issue 3, p. 1133
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-023-06508-5