- Title
Nonparametric Tensor Completion Based on Gradient Descent and Nonconvex Penalty.
- Authors
Xu, Kai; Xiong, Zhi
- Abstract
Existing tensor completion methods all require hyperparameters, which largely determine each method's performance and are difficult to tune. In this paper, we propose a novel nonparametric tensor completion method, which formulates tensor completion as an unconstrained optimization problem and designs an efficient iterative method to solve it. In each iteration, we not only compute the missing entries with the aid of data correlation, but also account for the low-rank structure of the tensor and the convergence speed of the iteration. Our iteration is based on the gradient descent method and approximates the gradient descent direction using tensor matricization and singular value decomposition. Because of the symmetry among the dimensions of a tensor, the optimal unfolding direction may differ from iteration to iteration, so in each iteration we select it by the scaled latent nuclear norm. Moreover, we design a formula for the iteration step size based on the nonconvex penalty. During the iterative process, we store the tensor in a sparse format and adopt the power method to compute the maximum singular value quickly. Experiments on image inpainting and link prediction show that our method is competitive with six state-of-the-art methods.
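The abstract relies on two standard building blocks: mode-n matricization (unfolding) of a tensor, and the power method for quickly estimating the maximum singular value of an unfolded matrix. The following NumPy sketch illustrates both in their textbook form; it is not the authors' implementation, and the function names, random initialization, and convergence settings are assumptions for illustration only.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move axis `mode` to the front and flatten the rest,
    giving a matrix of shape (T.shape[mode], prod(other dims))."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def max_singular_value(A, num_iters=500, tol=1e-12):
    """Estimate the largest singular value of A by power iteration on A^T A."""
    v = np.random.default_rng(0).standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    sigma = 0.0
    for _ in range(num_iters):
        w = A.T @ (A @ v)            # one application of A^T A
        norm_w = np.linalg.norm(w)   # converges to sigma_max ** 2
        v = w / norm_w
        sigma_new = np.sqrt(norm_w)
        if abs(sigma_new - sigma) < tol:
            break
        sigma = sigma_new
    return sigma
```

Iterating with `A.T @ (A @ v)` costs two matrix-vector products per step instead of a full SVD, which is what makes the power method attractive inside an iterative completion loop.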
- Subjects
SINGULAR value decomposition
- Publication
Symmetry, 2019, Vol. 11, Issue 12, p. 1512
- ISSN
2073-8994
- Publication type
Article
- DOI
10.3390/sym11121512