TY - GEN
T1 - Efficient dense subspace clustering
AU - Ji, Pan
AU - Salzmann, Mathieu
AU - Li, Hongdong
PY - 2014
Y1 - 2014
N2 - In this paper, we tackle the problem of clustering data points drawn from a union of linear (or affine) subspaces. To this end, we introduce an efficient subspace clustering algorithm that estimates dense connections between the points lying in the same subspace. In particular, instead of following the standard compressive sensing approach, we formulate subspace clustering as a Frobenius norm minimization problem, which inherently yields denser connections between the data points. While in the noise-free case we rely on the self-expressiveness of the observations, in the presence of noise we simultaneously learn a clean dictionary to represent the data. Our formulation lets us address the subspace clustering problem efficiently. More specifically, the solution can be obtained in closed form for outlier-free observations, and by performing a series of linear operations in the presence of outliers. Interestingly, we show that our Frobenius norm formulation shares the same solution as the popular nuclear norm minimization approach when the data is free of any noise, or, in the case of corrupted data, when a clean dictionary is learned. Our experimental evaluation on motion segmentation and face clustering demonstrates the benefits of our algorithm in terms of clustering accuracy and efficiency.
AB - In this paper, we tackle the problem of clustering data points drawn from a union of linear (or affine) subspaces. To this end, we introduce an efficient subspace clustering algorithm that estimates dense connections between the points lying in the same subspace. In particular, instead of following the standard compressive sensing approach, we formulate subspace clustering as a Frobenius norm minimization problem, which inherently yields denser connections between the data points. While in the noise-free case we rely on the self-expressiveness of the observations, in the presence of noise we simultaneously learn a clean dictionary to represent the data. Our formulation lets us address the subspace clustering problem efficiently. More specifically, the solution can be obtained in closed form for outlier-free observations, and by performing a series of linear operations in the presence of outliers. Interestingly, we show that our Frobenius norm formulation shares the same solution as the popular nuclear norm minimization approach when the data is free of any noise, or, in the case of corrupted data, when a clean dictionary is learned. Our experimental evaluation on motion segmentation and face clustering demonstrates the benefits of our algorithm in terms of clustering accuracy and efficiency.
UR - http://www.scopus.com/inward/record.url?scp=84904615403&partnerID=8YFLogxK
U2 - 10.1109/WACV.2014.6836065
DO - 10.1109/WACV.2014.6836065
M3 - Conference contribution
SN - 9781479949854
T3 - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
SP - 461
EP - 468
BT - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
PB - IEEE Computer Society
T2 - 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
Y2 - 24 March 2014 through 26 March 2014
ER -