Evolving Spatio-temporal Data Machines Based on the NeuCube Neuromorphic Framework: Design Methodology and Selected Applications

Kasabov, N
Scott, N
Tu, E
Marks, S
Sengupta, S
Capecci, E
Othman, M
Doborjeh, M
Murli, N
Hartono, R
Item type
Journal Article

The paper describes a new type of evolving connectionist systems (ECOS) called evolving spatio-temporal data machines (eSTDM), based on neuromorphic, brain-like information processing principles. These are multi-modular computer systems designed to deal with large and fast spatio/spectro temporal data, using spiking neural networks (SNN) as their major processing modules. ECOS, and eSTDM in particular, can learn incrementally from data streams; can include 'on the fly' new input variables, new output class labels, or regression outputs; can continuously adapt their structure and functionality; and can be visualised and interpreted for new knowledge discovery and for a better understanding of the data and the processes that generated them. eSTDM can be used for early event prediction, owing to the ability of an SNN to spike early, before the entire input vectors of the type it was trained on have been presented. A framework for building eSTDM, called NeuCube, is presented, along with a design methodology for building eSTDM with it. The implementation of this framework in MATLAB, Java, and PyNN (Python) is presented; the latter facilitates the use of neuromorphic hardware platforms to run the eSTDM. Selected examples are given of eSTDM for pattern recognition and early event prediction on EEG data, fMRI data, multisensory seismic data, ecological data, climate data, and audio-visual data. Future directions are discussed, including the extension of the NeuCube framework for building neurogenetic eSTDM and new applications of eSTDM.
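The early-event-prediction claim in the abstract rests on a general SNN property: a spiking neuron can fire as soon as accumulated evidence crosses its threshold, before the full input sequence has been presented. The following is a minimal leaky integrate-and-fire (LIF) sketch of that principle only; it is not the NeuCube implementation, and all parameter values (weight, decay, threshold) are illustrative assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrates the general SNN principle behind early event prediction:
# the neuron can spike well before the whole input train is seen.
# This is a generic sketch, not the NeuCube implementation; the
# parameters below are illustrative, not taken from the paper.

def lif_first_spike(input_spikes, weight=0.4, decay=0.9, threshold=1.0):
    """Return the time step of the first output spike, or None.

    input_spikes: sequence of 0/1 values, one per time step.
    """
    potential = 0.0
    for t, s in enumerate(input_spikes):
        potential = potential * decay + weight * s  # leak, then integrate
        if potential >= threshold:
            return t  # fires before the remaining input is presented
    return None

# A 10-step input train: the early burst drives the potential over
# threshold at step 2, long before the train ends.
spike_train = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]
print(lif_first_spike(spike_train))  # → 2
```

With weight 0.4 and decay 0.9, the potential reaches 0.4, then 0.76, then 1.084, crossing the threshold of 1.0 at the third time step; a weaker or sparser input train never crosses it and returns None.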

Spatio/spectro temporal data; Evolving Connectionist Systems; Evolving Spiking Neural Networks; Computational Neurogenetic Systems; Quantum inspired spiking neural networks; Evolving spatio-temporal data machines; NeuCube
Neural Networks: Special Issue on Learning in Big Data [Preprint submitted to Neural Networks]
Rights statement
Authors' pre-print may be posted on any website, including arXiv and RePEc.