Non-convex low-rank representation combined with rank-one matrix sum for subspace clustering
DOI: 10.1007/s00500-020-04865-0
Publication Date: 2020-04-18
ABSTRACT
Exploring the multiple subspace structures of data, for example via low-rank representation, is effective for subspace clustering. Non-convex low-rank representation (NLRR) via matrix factorization is one of the state-of-the-art techniques for subspace clustering. However, NLRR cannot scale to problems with large n (the number of samples), as it requires either the inversion of an $$n\times n$$ matrix or the solution of an $$n\times n$$ linear system. To address this issue, we propose a novel approach, NLRR++, which reformulates NLRR as a sum of rank-one components and applies column-wise block coordinate descent to update each component iteratively. NLRR++ reduces the time complexity per iteration from $${\mathcal {O}}(n^3)$$ to $${\mathcal {O}}(mnd)$$ and the memory complexity from $${\mathcal {O}}(n^2)$$ to $${\mathcal {O}}(mn)$$, where m is the dimensionality and d is the target rank (usually $$d\ll m\ll n$$). Our experimental results on simulations and real datasets demonstrate the efficiency and effectiveness of NLRR++: it is not only much faster than NLRR, but also scales to large datasets such as the ImageNet dataset with 120K samples.
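
The abstract describes the core idea of NLRR++: writing the representation matrix as a sum of rank-one components $$Z=\sum _{k=1}^{d} u_k v_k^{\top }$$ and sweeping over the components with column-wise block coordinate descent. The NumPy sketch below only illustrates that rank-one-sum idea on the simplified self-expressive loss $$\Vert X-XZ\Vert _F^2$$; the function name rank_one_bcd_sketch, the loss, the update rules (an exact least-squares step for each $$v_k$$ and a single Lipschitz-scaled gradient step for each $$u_k$$), and all parameter choices are assumptions made for illustration, not the published NLRR++ objective or updates.

import numpy as np

def rank_one_bcd_sketch(X, d, n_sweeps=20, eps=1e-12, seed=0):
    """Illustrative column-wise block coordinate descent over rank-one
    components Z = sum_k u_k v_k^T, fitted to the simplified objective
    ||X - X Z||_F^2.  A hypothetical sketch, not the published NLRR++
    algorithm (its regularizers and exact updates are not in the abstract)."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, d)) / np.sqrt(n)   # columns u_k
    V = rng.standard_normal((n, d)) / np.sqrt(n)   # columns v_k
    XU = X @ U                                     # cache of X u_k, m x d
    R = X - XU @ V.T                               # residual X - X Z, m x n
    spec_sq = np.linalg.norm(X, ord=2) ** 2        # ||X||_2^2, computed once
    for _ in range(n_sweeps):
        for k in range(d):
            # Take component k out of the residual so R excludes u_k v_k^T.
            R += np.outer(XU[:, k], V[:, k])
            Xu = XU[:, k]
            # Exact least-squares update of v_k with u_k fixed: O(mn).
            V[:, k] = (R.T @ Xu) / max(Xu @ Xu, eps)
            v = V[:, k]
            # One gradient step on u_k with v_k fixed, step size 1/Lipschitz.
            # Gradient of ||R - (X u) v^T||_F^2 is 2 X^T ((X u) v^T - R) v.
            grad = 2.0 * (X.T @ ((np.outer(Xu, v) - R) @ v))
            lip = 2.0 * (v @ v) * spec_sq
            U[:, k] -= grad / max(lip, eps)
            XU[:, k] = X @ U[:, k]
            # Put the refreshed rank-one component back into the residual.
            R -= np.outer(XU[:, k], V[:, k])
    return U, V

# Toy usage: 100-dimensional samples drawn from two 3-dimensional subspaces.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.hstack([rng.standard_normal((100, 3)) @ rng.standard_normal((3, 60))
                   for _ in range(2)])
    U, V = rank_one_bcd_sketch(X, d=6)
    print("relative fit:", np.linalg.norm(X - X @ (U @ V.T)) / np.linalg.norm(X))

In this sketch each inner update touches X a constant number of times, so one full sweep over the d components costs $${\mathcal {O}}(mnd)$$ time and the residual dominates memory at $${\mathcal {O}}(mn)$$, matching the complexity regime stated in the abstract. In the usual LRR-style pipeline, the resulting $$Z=UV^{\top }$$ (symmetrized, e.g. as $$|Z|+|Z|^{\top }$$) would then serve as the affinity matrix for spectral clustering.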