- Title
Mutual-manifold regularized robust fast latent LRR for subspace recovery and learning.
- Authors
Li, Xianzhen; Zhang, Zhao; Zhang, Li; Wang, Meng
- Abstract
In this paper, we propose a simple yet effective low-rank representation (LRR) and subspace recovery model called mutual-manifold regularized robust fast latent LRR. Our model improves representation ability and robustness in two ways. Specifically, our model is built on the Frobenius norm-based fast latent LRR, which decomposes the given data into a principal feature part, a salient feature part and a sparse error, and clearly improves it by designing a mutual-manifold regularization to encode, preserve and propagate local information between coefficients and salient features. The mutual-manifold regularization is defined by using the coefficients as adaptive reconstruction weights for the salient features and by constructing a Laplacian matrix over the salient features for the coefficients. Thus, important local topology information can be propagated between them, which can make the discovered subspace structures and features more accurate for data representation. Besides, our approach also improves the robustness of subspace recovery against noise and sparse errors in the coefficients: the original coefficient matrix is decomposed into an error-corrected part and a sparse error part fitting the noise in the coefficients, and the recovered coefficients are then used for robust subspace recovery. Experimental results on several public databases demonstrate that our method outperforms related algorithms.
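As a rough illustration of the Laplacian-based half of the mutual-manifold regularization described above, the sketch below builds a k-nearest-neighbor graph Laplacian over feature columns and evaluates the smoothness penalty tr(Z L Zᵀ) on a coefficient matrix. This is a minimal, generic construction assuming Gaussian edge weights and an unnormalized Laplacian; the paper's actual graph construction, weighting scheme, and optimization are not specified in this record, so all function names and parameters here are illustrative.

```python
import numpy as np

def knn_laplacian(F, k=5):
    """Unnormalized graph Laplacian L = D - W over the columns of F.

    Illustrative construction (not the paper's exact scheme): a k-NN
    graph with Gaussian weights, symmetrized by taking the max.
    """
    n = F.shape[1]
    # Pairwise squared Euclidean distances between sample columns.
    sq = np.sum(F ** 2, axis=0)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (F.T @ F), 0.0)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]   # skip self (distance 0)
        W[i, nbrs] = np.exp(-d2[i, nbrs])
    W = np.maximum(W, W.T)                  # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    return D - W

def manifold_penalty(Z, L):
    """tr(Z L Z^T): penalizes coefficient rows that vary sharply
    across neighboring nodes of the feature graph."""
    return float(np.trace(Z @ L @ Z.T))
```

Because the unnormalized Laplacian of a nonnegatively weighted graph is positive semidefinite, the penalty is always nonnegative and is zero only when each row of Z is constant over each connected component of the graph, which is what lets local topology information propagate from the salient features to the coefficients.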
- Subjects
LAPLACIAN matrices; FEATURE extraction; MATHEMATICAL regularization
- Publication
Neural Computing & Applications, 2020, Vol 32, Issue 17, p13363
- ISSN
0941-0643
- Publication type
Article
- DOI
10.1007/s00521-019-04688-7