On Polynomial Time Methods for Exact Low Rank Tensor Completion
SUBJECTS: Information Theory (cs.IT); Machine Learning (cs.LG); Machine Learning (stat.ML)
KEYWORDS: Tensor completion; Matrix completion; Tensor rank; Nonconvex optimization; Concentration inequality; U-statistics; Polynomial time complexity
DOI: 10.48550/arxiv.1702.06980
Publication Date: 2019-01-07
AUTHORS (2)
ABSTRACT
56 pages, 4 figures

In this paper, we investigate the sample size requirement for exact recovery of a high-order tensor of low rank from a subset of its entries. We show that a gradient descent algorithm with initial value obtained from a spectral method can, in particular, reconstruct a ${d\times d\times d}$ tensor of multilinear ranks $(r,r,r)$ with high probability from as few as $O(r^{7/2}d^{3/2}\log^{7/2}d+r^7d\log^6d)$ entries. In the case when the ranks $r=O(1)$, our sample size requirement matches those for nuclear norm minimization (Yuan and Zhang, 2016a) and for alternating least squares assuming orthogonal decomposability (Jain and Oh, 2014). Unlike these earlier approaches, however, our method is efficient to compute, easy to implement, and does not impose extra structure on the tensor. Numerical results are presented to further demonstrate the merits of the proposed approach.