EEG-based Brain-Computer Interface for Controlling a Drone - using machine learning, optimized for a battery-powered device
Brain-computer interfaces enable the use of brain waves to control external, computer-based devices. It can, however, be difficult to trigger a specific brain wave voluntarily, and the signal may not be repeatable enough to be used in this manner. EEG is used to record brain waves, and the parts of the recorded signal related to eye movements are normally considered artifacts (undesired). In this work, however, EEG recordings of eye movements are used to create datasets for training a machine learning algorithm that controls a drone in real time. A list of candidate features is compiled through a literature search and a comparison of eye-movement plots. A non-linear SVM classifier combined with a greedy feature selection algorithm is used to distinguish between blinks and looking straight ahead, to the left, right, up, and down. Data collected from two subjects is presented, obtaining an average offline accuracy of 94% when training on both subjects. A state machine was designed for controlling the drone and evaluating online classification. The down movement was discarded from the state machine due to an observed degradation in online performance when this class was included; online classification therefore uses only five eye movements. An average online accuracy of 94.5% was obtained when training on both subjects and testing on each separately. With the proof of concept established, the selected features and choice of algorithm are optimized for implementation on a battery-powered device with respect to energy efficiency and classifier accuracy. Execution-time speedup is used as a measure of energy efficiency in this thesis, since it can be combined with a race-to-halt strategy to optimize energy consumption. A brute-force search is used, together with the greedy feature selection algorithm, to find the best feature vector, yielding 9 features with a total extraction time of 242 µs on a personal computer.
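The greedy feature selection described above can be sketched as forward selection: starting from an empty set, repeatedly add the single candidate feature that most improves a scoring function, and stop when no candidate helps. This is a minimal, hypothetical illustration; the feature names and the toy scoring function below are invented for the example, whereas in the thesis the score would be the cross-validated accuracy of the non-linear SVM on the eye-movement classes.

```python
def greedy_forward_selection(candidates, score, max_features):
    """Greedy forward feature selection: at each step, add the one
    remaining candidate that most improves score(selected)."""
    candidates = list(candidates)
    selected = []
    best_score = float("-inf")
    while candidates and len(selected) < max_features:
        # Evaluate each remaining candidate together with the current set.
        trial = max(candidates, key=lambda f: score(selected + [f]))
        trial_score = score(selected + [trial])
        if trial_score <= best_score:
            break  # no candidate improves the score; stop early
        selected.append(trial)
        candidates.remove(trial)
        best_score = trial_score
    return selected, best_score

# Toy stand-in for SVM cross-validation accuracy: each hypothetical
# feature has a fixed marginal value, with a small per-feature penalty
# that models diminishing returns.
values = {"blink_amp": 4.0, "slope_lr": 3.0, "var_up": 1.0, "noise": -0.5}

def toy_score(feats):
    return sum(values[f] for f in feats) - 0.5 * (len(feats) - 1)

chosen, s = greedy_forward_selection(values, toy_score, max_features=4)
print(chosen)  # -> ['blink_amp', 'slope_lr', 'var_up']
```

The greedy search adds the useless "noise" feature last in its ranking and then stops before including it, which is the behavior that keeps the final feature vector small and cheap to extract.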
Through this method, a speedup of 48x was achieved with a decrease of 2.1% in online accuracy when training and testing on Subject 1. The subjects were able to use the system to control a drone in real time, showing that eye movements can be used for devices demanding a quick response. The system was observed to be delicate: it requires the subjects to stay highly concentrated and the electrodes to be positioned correctly, with good skin contact.
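The reason execution-time speedup serves as a proxy for energy here can be shown with simple race-to-halt arithmetic: compute as fast as possible, then sleep until the next classification window, so that shorter execution time converts almost directly into energy saved. The power figures and the 50 ms window below are hypothetical, chosen only for illustration; the 242 µs extraction time and 48x speedup are taken from the abstract (implying a roughly 11.6 ms baseline).

```python
# Race-to-halt energy model for one classification window:
# active power while computing, sleep power for the remainder.
# P_ACTIVE, P_SLEEP, and PERIOD are assumed, not from the thesis.
P_ACTIVE = 0.100   # W while computing (hypothetical)
P_SLEEP = 0.001    # W while sleeping (hypothetical)
PERIOD = 0.050     # s, one classification window (hypothetical)

def window_energy(t_exec):
    """Energy in joules for one window: compute for t_exec, sleep the rest."""
    return P_ACTIVE * t_exec + P_SLEEP * (PERIOD - t_exec)

t_opt = 242e-6            # optimized extraction time from the abstract
baseline = window_energy(t_opt * 48)   # 48x speedup implies this baseline
optimized = window_energy(t_opt)
print(f"energy saved per window: {1 - optimized / baseline:.1%}")
```

Under these assumptions the 48x speedup translates to an energy saving of over 90% per window, which is why the thesis can optimize for energy by minimizing feature extraction time.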