Sequential Subspace Clustering via Joint Capped $ℓ_2$ and $ℓ_{2,p}$ Norm Minimization with Convergence Guarantee

Author(s)

Zhihui Tu, Jian Lu, Wenyu Hu & Shan Liu
Abstract

Subspace clustering is a fundamental problem in machine learning that has attracted considerable attention in recent years. Most existing methods focus on designing effective models to regularize the coefficient matrix, often neglecting the impact of noise on subspace structures. However, real-world data are typically corrupted by noise, which can distort the underlying subspace structure. Additionally, for sequential data, a key challenge is the effective exploitation of temporal information. To address these issues, we propose a novel and robust sequential subspace clustering method, termed joint capped $ℓ_2$ and $ℓ_{2,p}$ norm minimization (JCLLM). The capped $ℓ_2$ norm-based loss function mitigates the influence of noise and outliers in regression, while the $ℓ_{2,p}$ norm regularization captures the temporal dependencies inherent in sequential data. We develop an iteratively reweighted optimization algorithm to solve the JCLLM model and prove its convergence to a stationary point. Extensive experiments on both synthetic and real-world datasets demonstrate that our method consistently outperforms several state-of-the-art subspace clustering approaches.
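As a rough illustration of the two building blocks named in the abstract (not the authors' implementation), the snippet below sketches how a capped $ℓ_2$ loss and an $ℓ_{2,p}$ norm could be evaluated in NumPy. The cap threshold `eps` and exponent `p` are hypothetical parameters chosen for the example; the actual JCLLM objective and its reweighted solver are defined in the paper itself.

```python
import numpy as np

def capped_l2_loss(R, eps):
    """Capped l2 loss: sum over columns of min(||r_i||_2, eps).

    R is a residual matrix whose columns are per-sample residuals;
    capping each column norm at eps limits the influence of
    outliers, which is the robustness idea described above.
    """
    col_norms = np.linalg.norm(R, axis=0)
    return float(np.sum(np.minimum(col_norms, eps)))

def l2p_norm(Z, p):
    """l_{2,p} norm: (sum over rows of ||z_i||_2^p)^(1/p).

    For 0 < p <= 1 this promotes row sparsity in the coefficient
    matrix Z, one common way to encode shared structure across
    sequential samples.
    """
    row_norms = np.linalg.norm(Z, axis=1)
    return float(np.sum(row_norms ** p) ** (1.0 / p))
```

For example, with `R = np.eye(3)` every column has unit norm, so `capped_l2_loss(R, 0.5)` caps each at 0.5 and returns 1.5; `l2p_norm(np.eye(3), 1.0)` reduces to the sum of row norms, 3.0.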

Author Biographies

  • Zhihui Tu

    School of Mathematical Sciences, Jiangxi Science and Technology Normal University, Nanchang 330038, China

  • Jian Lu

    School of Mathematical Sciences, Shenzhen University, Shenzhen 518060, China

  • Wenyu Hu

    School of Mathematics and Computer Sciences, Gannan Normal University, Ganzhou 341000, China 

    Key Laboratory of Data Science and Artificial Intelligence of Jiangxi Education Institutes, Ganzhou 341000, China

  • Shan Liu

    School of Mathematical Sciences, Jiangxi Science and Technology Normal University, Nanchang 330038, China

About this article


DOI

10.4208/nmtma.OA-2025-0081