Deep Inductive and Scalable Subspace Clustering via Nonlocal Contrastive Self-distillation

aut.relation.endpage: 1
aut.relation.issue: 99
aut.relation.journal: IEEE Transactions on Circuits and Systems for Video Technology
aut.relation.startpage: 1
aut.relation.volume: PP
dc.contributor.author: Zhu, W
dc.contributor.author: Peng, B
dc.contributor.author: Yan, WeiQi
dc.date.accessioned: 2026-01-14T23:25:38Z
dc.date.available: 2026-01-14T23:25:38Z
dc.date.issued: 2025-09-24
dc.description.abstract: Deep subspace clustering has demonstrated remarkable results by leveraging the nonlinear subspace assumption. However, it often encounters challenges in terms of computational cost and memory footprint when dealing with large-scale data, due to its traditional single-batch training strategy. To address this issue, this paper proposes a deep subspace clustering framework that is regularized by nonlocal contrastive self-distillation, enabling a Deep Inductive and Scalable Subspace Clustering (DISSC) algorithm. In particular, our framework incorporates two subspace learning modules, namely subspace learning based on the self-expression model and inductive subspace clustering. These modules generate affinities from different perspectives by extracting intermediate features from two augmentations of the input data using a weight-sharing neural network. By integrating the concept of self-distillation, our framework effectively exploits the clustering-friendly knowledge contained in these two affinities through a novel nonlocal contrastive prediction task, employing an empirical yet effective threshold. This allows the framework to facilitate complementary knowledge mining and scalability without compromising clustering performance. With an alternate branch that bypasses the self-expression computation, our framework can infer subspace membership of out-of-sample data through the predicted soft labels, eliminating the need for ad-hoc postprocessing. In addition, the self-expression matrix computed using mini-batch data benefits from the distilled knowledge obtained from the inductive subspace clustering module, enabling our framework to scale to data of arbitrary size. Experiments conducted on the large-scale MNIST, Fashion-MNIST, STL-10, CIFAR-10 and Stanford Online Products datasets validate the superiority of the proposed DISSC algorithm over state-of-the-art subspace clustering methods.
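For context, the abstract's "self-expression model" refers to the classical formulation underlying subspace clustering: each sample is reconstructed as a linear combination of the other samples, and the resulting coefficient matrix induces an affinity for spectral clustering. A minimal sketch of that baseline (with Frobenius-norm regularization, which admits a closed-form solution) is given below; it is an illustration of the standard model, not the paper's deep mini-batch algorithm, and the function names and toy data are hypothetical.

```python
import numpy as np

def self_expression(X, lam=0.1):
    """Classical self-expression: minimize ||X - C X||_F^2 + lam * ||C||_F^2
    over the (n, n) coefficient matrix C, for X of shape (n_samples, n_features).
    With Frobenius regularization the minimizer is C = (G + lam I)^{-1} G,
    where G = X X^T is the Gram matrix over samples."""
    n = X.shape[0]
    G = X @ X.T
    return np.linalg.solve(G + lam * np.eye(n), G)

def affinity(C):
    """Symmetrized magnitude of C, the usual input to spectral clustering."""
    A = np.abs(C)
    return 0.5 * (A + A.T)

# Toy data: 10 points on each of two orthogonal 1-D subspaces in R^3.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(size=(10, 1)) * np.array([1.0, 0.0, 0.0]),
    rng.normal(size=(10, 1)) * np.array([0.0, 1.0, 1.0]),
])
C = self_expression(X, lam=0.01)
A = affinity(C)
# Affinity mass should concentrate within each subspace, not across them.
same = A[:10, :10].sum() + A[10:, 10:].sum()
cross = A[:10, 10:].sum() + A[10:, :10].sum()
print(same > cross)
```

The paper replaces this transductive, whole-dataset computation with a deep network trained on mini-batches, using the distilled affinities to avoid recomputing C over all samples at once.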
dc.identifier.citation: IEEE Transactions on Circuits and Systems for Video Technology, ISSN: 1051-8215 (Print); 1558-2205 (Online), Institute of Electrical and Electronics Engineers (IEEE), PP(99), 1-1. doi: 10.1109/TCSVT.2025.3613980
dc.identifier.doi: 10.1109/TCSVT.2025.3613980
dc.identifier.issn: 1051-8215
dc.identifier.issn: 1558-2205
dc.identifier.uri: http://hdl.handle.net/10292/20500
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.uri: https://ieeexplore.ieee.org/document/11177597
dc.rights: This is the Author's Accepted Manuscript of an article published in IEEE Transactions on Circuits and Systems for Video Technology. The Version of Record will be available at DOI: 10.1109/TCSVT.2025.3613980
dc.rights.accessrights: OpenAccess
dc.subject: 46 Information and Computing Sciences
dc.subject: 4611 Machine Learning
dc.subject: Bioengineering
dc.subject: Networking and Information Technology R&D (NITRD)
dc.subject: Machine Learning and Artificial Intelligence
dc.subject: 0801 Artificial Intelligence and Image Processing
dc.subject: 0906 Electrical and Electronic Engineering
dc.subject: Artificial Intelligence & Image Processing
dc.subject: 4006 Communications engineering
dc.subject: 4009 Electronics, sensors and digital hardware
dc.subject: 4603 Computer vision and multimedia computation
dc.title: Deep Inductive and Scalable Subspace Clustering via Nonlocal Contrastive Self-distillation
dc.type: Journal Article
pubs.elements-id: 632762

Files

Original bundle
Name: Deep_Inductive_and_Scalable_Subspace_Clustering_via_Nonlocal_Contrastive_Self-Distillation.pdf
Size: 2.77 MB
Format: Adobe Portable Document Format
Description: Author's Accepted Manuscript

License bundle
Name: license.txt
Size: 1.37 KB
Format: Plain Text