Sequential Subspace Clustering via Joint Capped $ℓ_2$ and $ℓ_{2,p}$ Norm Minimization with Convergence Guarantee
Abstract
Subspace clustering is a fundamental problem in machine learning that has attracted considerable attention in recent years. Most existing methods focus on designing effective models to regularize the coefficient matrix, often neglecting the impact of noise on subspace structures. However, real-world data are typically corrupted by noise, which can distort the underlying subspace structure. Additionally, for sequential data, a key challenge is the effective exploitation of temporal information. To address these issues, we propose a novel and robust sequential subspace clustering method, termed joint capped $ℓ_2$ and $ℓ_{2,p}$ norm minimization (JCLLM). The capped $ℓ_2$ norm-based loss function mitigates the influence of noise and outliers in regression, while the $ℓ_{2,p}$ norm regularization captures the temporal dependencies inherent in sequential data. We develop an iteratively reweighted optimization algorithm to solve the JCLLM model and prove its convergence to a stationary point. Extensive experiments on both synthetic and real-world datasets demonstrate that our method consistently outperforms several state-of-the-art subspace clustering approaches.
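To make the two ingredients concrete, the following sketch illustrates generic definitions of a capped $ℓ_2$ loss and an $ℓ_{2,p}$ norm. This is an assumption-laden illustration, not the paper's exact formulation: the capping threshold `eps`, the choice of clipping unsquared column norms, and applying the $ℓ_{2,p}$ norm to matrix rows are all hypothetical choices made here for clarity.

```python
import numpy as np

def capped_l2_loss(residuals, eps):
    """Capped l2 loss: clip column-wise l2 norms at eps so that gross
    outliers contribute at most eps each. (Hypothetical form; the paper
    may cap squared norms or use a different aggregation.)"""
    col_norms = np.linalg.norm(residuals, axis=0)
    return np.minimum(col_norms, eps).sum()

def l2p_norm(Z, p):
    """l_{2,p} 'norm': (sum over rows of ||row||_2^p)^(1/p).
    For 0 < p < 1 it promotes row sparsity more aggressively than
    the convex l_{2,1} case (p = 1)."""
    row_norms = np.linalg.norm(Z, axis=1)
    return (row_norms ** p).sum() ** (1.0 / p)
```

With $p=1$ this reduces to the familiar $ℓ_{2,1}$ norm; driving $p$ below 1 is what gives the nonconvex, sparsity-inducing behavior, which in turn is why an iteratively reweighted solver with a convergence proof is needed.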