Boosting performance of incremental IDR/QR LDA - from sequential to chunk

Date
2011
Authors
Peng, Yiming
Supervisor
Fong, Alvis
Pang, Shaoning
Item type
Thesis
Degree name
Master of Computer and Information Sciences
Publisher
Auckland University of Technology
Abstract

Training data in the real world often arrives in chunks of random size. Yet the existing sequential incremental IDR/QR LDA (sIncLDA) can only process data one instance at a time. This thesis proposes a new chunk incremental IDR/QR LDA (cIncLDA) capable of processing multiple data instances at once. sIncLDA updates the reduced within-class scatter matrix W through a QR decomposition of the centroid matrix for each newly arrived data instance, assuming that the updated Q' ≈ Q for any instance from an existing class and the updated W' ≈ W for any instance from a new class. In practice, this assumption causes significant loss of discriminative information through the approximation of Q and W when the number of classes is large. By using a new method that updates W accurately, the proposed cIncLDA better preserves the discriminative information contained in W, thereby resolving this limitation of sIncLDA. Experimental comparisons have been conducted on six facial datasets whose class numbers range from 40 to 1010. The results indicate that our algorithm achieves accuracy competitive with batch IDR/QR LDA and consistently higher than sIncLDA. The computational cost of our algorithm is higher than that of sIncLDA when processing single instances (the sequential manner); however, its efficiency surpasses that of sIncLDA as the chunk size increases when processing multiple instances (the chunk manner).
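
For readers who want a concrete picture of the quantities named in the abstract (the class-centroid matrix, its QR decomposition, and the reduced within-class scatter W), the following is a minimal NumPy sketch. It is not the thesis's algorithm: the function names batch_idr_qr_lda and chunk_update are hypothetical, and the chunk step simply recomputes the batch solution from all data seen so far, to show the result that an exact chunk update such as cIncLDA is meant to reproduce efficiently rather than the incremental formulas derived in the thesis.

```python
import numpy as np

def batch_idr_qr_lda(X, y):
    """Illustrative batch IDR/QR-style step: QR-decompose the class-centroid
    matrix, then form the reduced within-class scatter in that basis.
    Simplified sketch, not the thesis's exact procedure."""
    classes = np.unique(y)
    # Centroid matrix C: one column per class mean (d x k)
    C = np.column_stack([X[y == c].mean(axis=0) for c in classes])
    # Economy QR decomposition of the centroid matrix
    Q, R = np.linalg.qr(C)
    # Reduced within-class scatter W = Q^T S_w Q (k x k)
    W = np.zeros((Q.shape[1], Q.shape[1]))
    for c in classes:
        Xc = X[y == c] - X[y == c].mean(axis=0)   # center each class
        Z = Xc @ Q                                # project onto span(Q)
        W += Z.T @ Z
    return Q, R, W

def chunk_update(X_all, y_all, X_chunk, y_chunk):
    """Naive 'exact' chunk update, used only to illustrate the target of
    cIncLDA: after a chunk arrives (possibly containing new classes),
    Q and W should match a batch recomputation on all data seen so far."""
    X_new = np.vstack([X_all, X_chunk])
    y_new = np.concatenate([y_all, y_chunk])
    return X_new, y_new, batch_idr_qr_lda(X_new, y_new)

# Example: 40-dimensional data with 3 classes, then a chunk adding a new class
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40)) + np.repeat(np.arange(3), 20)[:, None]
y = np.repeat(np.arange(3), 20)
Q, R, W = batch_idr_qr_lda(X, y)

X_chunk = rng.normal(size=(10, 40)) + 5.0
y_chunk = np.full(10, 3)
X, y, (Q, R, W) = chunk_update(X, y, X_chunk, y_chunk)
print(Q.shape, W.shape)   # (40, 4) (4, 4) after the new class arrives
```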

Keywords
Incremental learning, Chunk, Sequential, Linear Discriminant Analysis