It is of paramount importance to track the cognitive activity or cognitive attention of service personnel in a Prognostics and Health Management (PHM) service-related training or operation environment. Electroencephalography (EEG) data are a strong candidate for recognizing the user's cognitive activity. Analyzing EEG data in an unconstrained (natural) environment to understand cognitive state and classify human activity is challenging for several reasons: low signal-to-noise ratio, the transient nature of the signal, lack of an available baseline, and uncontrolled mixing of various tasks. This paper proposes a deep-learning-based framework that monitors human activity by fusing multiple EEG sensors and also selects a smaller sensor suite for a lean data collection system. Real-time classification of human activity from spatially non-collocated multi-probe EEG is performed with deep learning techniques, without significant data preprocessing or manual feature engineering. Two types of deep neural networks, a deep belief network (DBN) and a deep convolutional neural network (DCNN), form the core of the proposed framework, which automatically learns the features needed from EEG for a given classification task. Validation on an extensive dataset, collected from several subjects while they performed multiple tasks (listening and watching) during a PHM service training session, is presented, and significant parallels are drawn to existing domain knowledge on EEG data. Comparison with benchmark machine learning techniques shows that deep-learning-based tools are better at understanding EEG data for task classification. Sensor selection further shows that a significantly smaller EEG sensor suite can perform at an accuracy comparable to the original sensor suite.
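To make the DCNN idea concrete, the following is a minimal sketch of a convolutional forward pass over a multi-probe EEG window, ending in a softmax over task classes. All dimensions, filter counts, and parameters here are illustrative assumptions (the paper's actual architectures and training procedure are not reproduced); the weights are random stand-ins for what the network would learn from data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 14 EEG probes, 256 time samples per window,
# 2 task classes (e.g., "listening" vs. "watching").
n_channels, n_samples, n_classes = 14, 256, 2
kernel_len, n_filters = 16, 8

# Random (untrained) parameters -- in practice these are learned,
# which is what replaces manual feature engineering.
W_conv = rng.normal(0.0, 0.1, (n_filters, n_channels, kernel_len))
b_conv = np.zeros(n_filters)
W_out = rng.normal(0.0, 0.1, (n_filters, n_classes))
b_out = np.zeros(n_classes)

def conv1d(x, W, b):
    """Valid 1D convolution over time; each filter mixes all channels,
    which is one way to fuse spatially non-collocated probes."""
    n_f, _, k = W.shape
    out_len = x.shape[1] - k + 1
    out = np.empty((n_f, out_len))
    for f in range(n_f):
        for t in range(out_len):
            out[f, t] = np.sum(W[f] * x[:, t:t + k]) + b[f]
    return out

def forward(x):
    """Conv -> ReLU -> global average pool -> linear -> softmax."""
    h = np.maximum(conv1d(x, W_conv, b_conv), 0.0)   # (n_filters, T')
    pooled = h.mean(axis=1)                          # (n_filters,)
    logits = pooled @ W_out + b_out                  # (n_classes,)
    e = np.exp(logits - logits.max())
    return e / e.sum()

# A synthetic EEG window stands in for a real multi-probe recording.
x = rng.normal(0.0, 1.0, (n_channels, n_samples))
probs = forward(x)
print(probs)
```

Because the convolution here spans all channels, dropping rows of `x` (and the matching slices of `W_conv`) is the mechanical analogue of the sensor-selection step: a smaller probe set simply shrinks the channel dimension.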