Untangling the Relation Between Microsaccades, Gamma-Band Synchronisation and Visual Perception in Human EEG Signal using Automated Machine Learning (PhD)

Electroencephalographic (EEG) signals allow completely non-invasive investigation of the awake human brain. The goal of computational neuroscience is to better understand the mechanisms by which the brain functions. Modern machine learning algorithms can uncover patterns within complex datasets and learn to differentiate between different states. Here, we propose training support vector machine (SVM) algorithms on EEG data recorded while the subject was in different cognitive states of visual perception – i.e. reporting being aware of seeing an object or not, given the same stimuli. Prior work using SVMs in a similar way to classify motor intent from EEG data proved usefully accurate [over 90% success – Andrew Stewart MSc project], and there are examples in the literature of EEG being used to track visual perception [Johnson, 2005; Supp, 2007]. We hypothesise that the information content of the EEG data can be probed by comparing cognitive-state prediction accuracy with and without selectively filtering out certain components of the EEG (such as gamma-band, ~40 Hz, synchronisation). We suggest that this could greatly inform top-down models of visual perception.
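
A minimal sketch of the proposed comparison is shown below: decode the reported perceptual state from EEG epochs with an SVM, then repeat the decoding after suppressing the gamma band with a band-stop filter and compare cross-validated accuracies. The data arrays, sampling rate, 35–45 Hz band, and per-channel log-power features are assumptions for illustration only, not the project's actual pipeline.

```python
# Hypothetical sketch: SVM decoding of perceptual state from EEG epochs,
# with and without a band-stop filter around the ~40 Hz gamma band.
# Placeholder random data stands in for real recordings; the sampling rate,
# band edges and feature choice are assumptions, not project details.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

fs = 250.0                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 32, 500))  # trials x channels x samples (placeholder)
labels = rng.integers(0, 2, size=200)         # 1 = "aware of stimulus", 0 = "unaware"

def bandstop(data, low=35.0, high=45.0, order=4):
    """Suppress the ~40 Hz gamma band along the time axis of each epoch."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="bandstop")
    return filtfilt(b, a, data, axis=-1)

def decode(data):
    """Cross-validated SVM accuracy on per-channel log band-power features."""
    features = np.log(np.var(data, axis=-1))   # crude power estimate per channel
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    return cross_val_score(clf, features, labels, cv=5).mean()

acc_full = decode(epochs)                # all frequency components retained
acc_nogamma = decode(bandstop(epochs))   # gamma band filtered out
print(f"accuracy with gamma: {acc_full:.2f}, without gamma: {acc_nogamma:.2f}")
```

Under this scheme, a drop in decoding accuracy after the gamma band is removed would be taken as evidence that gamma-band synchronisation carries information about the subject's perceptual state.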
