Hierarchical Data Classification Using Deep Neural Networks

aut.relation.pages: 10
aut.relation.volume: 3
aut.researcher: Narayanan, Ajit
dc.contributor.author: Tirimula, SS
dc.contributor.author: Narayanan, A
dc.contributor.editor: Arik, S
dc.contributor.editor: Huang, T
dc.contributor.editor: Lai, WK
dc.contributor.editor: Liu, Q
dc.date.accessioned: 2017-10-15T22:52:26Z
dc.date.available: 2017-10-15T22:52:26Z
dc.date.copyright: 2015
dc.date.issued: 2015
dc.description.abstract: Deep Neural Networks (DNNs) are becoming an increasingly interesting, valuable and efficient machine learning paradigm, with applications in natural language processing, image recognition and hand-written character recognition. Deep architectures are increasingly applied in domains that contain feature hierarchies (FHs), i.e. domains in which higher-level features are formed by the composition of lower-level features. This is because of a perceived relationship between, on the one hand, the hierarchical organisation of DNNs, with large numbers of neurons at the bottom layers and increasingly smaller numbers at upper layers, and, on the other hand, FHs, with comparatively large numbers of low-level features giving rise to a small number of high-level features. However, it is not clear what the relationship between DNN hierarchies and FHs should be, or whether one even exists. Nor is it clear whether modelling FHs with a hierarchically organised DNN conveys any benefit over using non-hierarchical neural networks. This study explores these questions and is organised into two parts. First, a taxonomic FH with associated data is generated and a DNN is trained to classify organisms into species according to their characteristic features. The second part tests the ability of DNNs to identify whether two given organisms are related, based on the features they share in their FHs. The experimental results show that classification accuracy decreases as ‘depth’ increases. Further, DNNs in which every hidden layer has the same number of nodes outperformed DNNs with increasingly fewer hidden nodes at higher layers. In other words, our experiments show that the relationship between DNNs and FHs is not simple and may require further extensive experimental research to identify the best DNN architectures for learning FHs.
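For illustration only: the abstract contrasts DNNs whose hidden layers shrink towards the output with DNNs whose hidden layers all have the same width. The following is a minimal sketch of those two configurations using Keras; the layer sizes, input dimensionality and number of species classes are hypothetical placeholders and are not taken from the paper.

```python
# Sketch of the two hidden-layer configurations compared in the abstract.
# All sizes are hypothetical placeholders, not the authors' actual setup.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Input

N_FEATURES = 64   # hypothetical number of low-level characteristic features
N_SPECIES = 10    # hypothetical number of species classes


def tapering_dnn():
    """DNN with increasingly fewer hidden nodes at higher layers."""
    return Sequential([
        Input(shape=(N_FEATURES,)),
        Dense(64, activation="relu"),
        Dense(32, activation="relu"),
        Dense(16, activation="relu"),
        Dense(N_SPECIES, activation="softmax"),
    ])


def uniform_dnn():
    """DNN in which every hidden layer has the same number of nodes."""
    return Sequential([
        Input(shape=(N_FEATURES,)),
        Dense(32, activation="relu"),
        Dense(32, activation="relu"),
        Dense(32, activation="relu"),
        Dense(N_SPECIES, activation="softmax"),
    ])


for build in (tapering_dnn, uniform_dnn):
    model = build()
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
```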
dc.identifier.citation: In International Conference on Neural Information Processing (pp. 492-500). Springer, Cham.
dc.identifier.doi: 10.1007/978-3-319-26555-1
dc.identifier.isbn: 978-3-319-26554-4
dc.identifier.uri: https://hdl.handle.net/10292/10870
dc.publisher: Springer
dc.relation.uri: http://www.springer.com/us/book/9783319265544
dc.rights: An author may self-archive an author-created version of his/her article on his/her own website and/or in his/her institutional repository. He/she may also deposit this version on his/her funder's or funder's designated repository at the funder's request or as a result of a legal obligation, provided it is not made publicly available until 12 months after official publication. He/she may not use the publisher's PDF version, which is posted on www.springerlink.com, for the purpose of self-archiving or deposit. Furthermore, the author may only post his/her version provided acknowledgement is given to the original source of publication and a link is inserted to the published article on Springer's website. The link must be accompanied by the following text: "The final publication is available at www.springerlink.com". (Please also see Publisher's Version and Citation.)
dc.rights.accessrights: OpenAccess
dc.subject: Deep Neural Networks; Hierarchical data classification
dc.title: Hierarchical Data Classification Using Deep Neural Networks
dc.type: Conference Contribution
pubs.elements-id: 194065
pubs.organisational-data: /AUT
pubs.organisational-data: /AUT/Design & Creative Technologies
Files
Original bundle
Name: Hierarchial _ ICONIP.pdf
Size: 115.83 KB
Format: Adobe Portable Document Format
Description: Conference contribution
License bundle
Name: RE4.10 Grant of Licence.docx
Size: 14.05 KB
Format: Microsoft Word 2007+
Description: