Deep Semi-supervised Learning via Dynamic Anchor Graph Embedding in Latent Space

aut.filerelease.date: 2022-12-17
aut.relation.journal: Neural Networks
aut.researcher: Kasabov, Nikola
dc.contributor.author: Tu, E
dc.contributor.author: Wang, Z
dc.contributor.author: Yang, J
dc.contributor.author: Kasabov, N
dc.date.accessioned: 2022-02-08T02:44:12Z
dc.date.available: 2022-02-08T02:44:12Z
dc.date.copyright: 2021
dc.date.issued: 2021
dc.description.abstract: Deep semi-supervised graph embedding learning has recently drawn much attention for its appealing performance on data with a pre-specified graph structure, which may be predefined or constructed empirically from the given data samples. However, pre-specified graphs often contain considerable noisy or inaccurate connections and can become very large for big datasets. Most existing embedding algorithms simply take the graph as given throughout training, so they are easily misled by inaccurate graph edges and may also produce large models. In this paper, we address these issues by proposing a novel deep semi-supervised algorithm for simultaneous graph embedding and node classification that learns a dynamic graph in the hidden-layer space of a neural network. Specifically, we construct an anchor graph that summarizes the whole dataset using the hidden-layer features of a consistency-constrained network. The anchor graph is used to sample node neighborhood context, which is then presented together with node labels as contextual information to train an embedding network. The outputs of the consistency network and the embedding network are finally concatenated and passed through a softmax function to perform node classification. The two networks are optimized jointly on both labeled and unlabeled data to minimize a single semi-supervised objective comprising a cross-entropy loss, a consistency loss and an embedding loss. Extensive experiments on popular image and text datasets show that the proposed method improves on existing graph embedding and node classification methods and outperforms many state-of-the-art approaches on both types of datasets. (An illustrative sketch of the joint objective appears at the end of this record.)
dc.identifier.citation: Neural Networks, Volume 146, February 2022, Pages 350-360
dc.identifier.doi: 10.1016/j.neunet.2021.11.026
dc.identifier.issn: 0893-6080
dc.identifier.uri: https://hdl.handle.net/10292/14889
dc.language: en
dc.publisher: Elsevier BV
dc.relation.uri: https://www.sciencedirect.com/science/article/abs/pii/S0893608021004676
dc.rights: Copyright © 2022 Elsevier Ltd. All rights reserved. This is the author's version of a work that was accepted for publication in (see Citation). Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. The definitive version was published in (see Citation). The original publication is available at (see Publisher's Version).
dc.rights.accessrights: OpenAccess
dc.subject: Graph Embedding; Semi-Supervised Learning; Dynamic Anchor Graph; Image / Text Classification
dc.title: Deep Semi-supervised Learning via Dynamic Anchor Graph Embedding in Latent Space
dc.type: Journal Article
pubs.elements-id: 445807
pubs.organisational-data: /AUT
pubs.organisational-data: /AUT/Faculty of Design & Creative Technologies
pubs.organisational-data: /AUT/PBRF
pubs.organisational-data: /AUT/PBRF/PBRF Design and Creative Technologies
pubs.organisational-data: /AUT/PBRF/PBRF Design and Creative Technologies/PBRF ECMS
Files
License bundle (1 of 1)
Name: AUT Grant of Licence for Tuwhera Jun 2021.pdf
Size: 360.95 KB
Format: Adobe Portable Document Format
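
Illustrative sketch of the joint objective described in the abstract: a minimal, hedged example of how the three terms (cross-entropy on labeled nodes, a consistency term on perturbed unlabeled inputs, and an embedding term over context pairs sampled from the anchor graph) could be combined. It assumes PyTorch; every name here (AnchorGraphSSL, joint_loss, anchor_pairs, neg_samples, the Gaussian perturbation) is hypothetical and not taken from the authors' code, and the anchor-graph construction and context sampling themselves are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AnchorGraphSSL(nn.Module):
    # Two branches, as in the abstract: a consistency-constrained network whose hidden
    # features would feed anchor-graph construction, and an embedding network trained
    # with graph-context supervision; their outputs are concatenated for classification.
    def __init__(self, in_dim, hid_dim, emb_dim, n_classes):
        super().__init__()
        self.cons = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.emb = nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU())
        self.classifier = nn.Linear(hid_dim + emb_dim, n_classes)

    def forward(self, x):
        h = self.cons(x)                                    # latent features (anchor graph built from these)
        z = self.emb(x)                                     # node embedding
        logits = self.classifier(torch.cat([h, z], dim=1))  # concatenated outputs -> class scores
        return h, z, logits

def joint_loss(model, x_l, y_l, x_u, anchor_pairs, neg_samples,
               lambda_c=1.0, lambda_e=1.0):
    # Cross-entropy loss on labeled nodes.
    _, _, logits = model(x_l)
    ce = F.cross_entropy(logits, y_l)

    # Consistency loss: predictions on unlabeled nodes should agree under two random
    # perturbations (a simple Gaussian perturbation stands in for the paper's scheme).
    _, _, p1 = model(x_u + 0.01 * torch.randn_like(x_u))
    _, _, p2 = model(x_u + 0.01 * torch.randn_like(x_u))
    cons = F.mse_loss(F.softmax(p1, dim=1), F.softmax(p2, dim=1))

    # Embedding loss: skip-gram-style objective on (node, context) pairs sampled from
    # the anchor graph, with negative samples pushed apart.
    _, z_a, _ = model(anchor_pairs[0])
    _, z_p, _ = model(anchor_pairs[1])
    _, z_n, _ = model(neg_samples)
    emb = (-F.logsigmoid((z_a * z_p).sum(dim=1))
           - F.logsigmoid(-(z_a * z_n).sum(dim=1))).mean()

    return ce + lambda_c * cons + lambda_e * emb

In a full implementation, the anchor graph would be rebuilt periodically from the hidden features h as training proceeds, which is what makes the graph dynamic; that machinery, and the anchor-based approximation of the full graph, are beyond this sketch.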