Brain-computer interfaces for virtual quadcopters based on a spiking neural network architecture - NeuCube
This study proposes a novel framework, NeuCube, which uses a spiking neural network to learn spatio-temporal and spectro-temporal data. It is capable of learning and classifying such data in real time (online).
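Before a spiking network can learn from continuous EEG, the signal must be converted into spike trains. A common encoding in NeuCube-style pipelines is a threshold-based representation (TBR), which emits a positive or negative spike whenever the signal changes by more than a threshold between consecutive samples. The sketch below is illustrative, assuming a single-channel signal; the function name and threshold value are not from the original study.

```python
import numpy as np

def threshold_encode(signal, threshold):
    """Threshold-based representation (TBR): emit a +1 spike when the
    signal rises by more than `threshold` between consecutive samples,
    a -1 spike when it falls by more than `threshold`, else 0."""
    diffs = np.diff(signal)
    spikes = np.zeros_like(diffs, dtype=int)
    spikes[diffs > threshold] = 1
    spikes[diffs < -threshold] = -1
    return spikes

# Toy EEG-like trace (values and threshold chosen only for illustration)
signal = np.array([0.0, 0.5, 0.6, 0.1, 0.1, 0.9])
print(threshold_encode(signal, threshold=0.3))  # [ 1  0 -1  0  1]
```

The resulting bipolar spike train preserves the timing of signal changes, which is what the spiking network's spatio-temporal learning operates on.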
A NeuCube-based methodology is proposed, implemented and tested for controlling a quadcopter with brain signals. A quadcopter is a flying drone valued for its stability and its ability to carry heavy loads in practical applications. A drone's motion in flight is described by three rotational movements: pitch, yaw and roll. These movements are managed by a small onboard unit called a flight controller, which integrates sensors such as a 3-axis gyroscope, an accelerometer and a barometer to keep the drone stable. Drones are usually operated with a radio control kit; in this study, the quadcopter is instead controlled with brain data.
In this study, a 14-channel Emotiv EPOC EEG device was used. Rather than flying a real quadcopter, the author uses a virtual environment in which a virtual drone is moved in four directions. For each direction, a distinct facial movement was used to produce and record EEG data: left eyebrow up, right eyebrow up, both eyebrows up, and both eyebrows down. NeuCube uses DeSNN (Dynamic Evolving Spiking Neural Network) to classify the data and send the corresponding movement commands to the virtual quadcopter.
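The final step, turning a DeSNN class label into a drone command, can be sketched as a simple lookup. The mapping below is purely illustrative: the study names the four facial gestures but does not specify which gesture drives which direction, and the label and command strings are assumptions.

```python
# Hypothetical mapping from DeSNN class labels (the four facial
# gestures recorded in the study) to virtual-drone commands.
# The gesture-to-direction assignment here is illustrative only.
COMMANDS = {
    "left_eyebrow_up": "move_left",
    "right_eyebrow_up": "move_right",
    "both_eyebrows_up": "move_forward",
    "both_eyebrows_down": "move_backward",
}

def command_for(label: str) -> str:
    """Translate a classifier output into a drone command.

    Unknown labels fall back to 'hover' so that a misclassified
    or noisy output keeps the virtual drone in place."""
    return COMMANDS.get(label, "hover")

print(command_for("both_eyebrows_up"))  # move_forward
print(command_for("noise"))             # hover
```

A fail-safe default such as "hover" is a sensible design choice for any BCI control loop, since EEG classification is never perfectly reliable.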
In testing, the system produced good classification and control results.