IEEE Access | 2021

LiHEA: Migrating EEG Analytics to Ultra-Edge IoT Devices With Logic-in-Headbands

Abstract


Traditional cloud-based processing of raw electroencephalogram (EEG) data, particularly for continuous monitoring use-cases, consumes precious network resources and contributes to delay. Motivated by the paradigm shift toward edge computing and the Internet of Things (IoT) for continuous monitoring, we focus in this paper on the first step toward carrying out EEG edge analytics at the last frontier (i.e., the ultra-edge) of our considered cyber-physical system, thereby ensuring users’ convenience and privacy. To overcome the challenges posed by the computational and energy resource constraints of IoT devices (e.g., EEG headbands/headsets), we envision a smart, lightweight model, referred to as Logic-in-Headbands based Edge Analytics (LiHEA), which can be seamlessly incorporated into consumer-grade EEG headsets to reduce delay and bandwidth consumption. By systematically investigating various traditional machine learning and deep learning models, we identify and select the best model for our envisioned LiHEA. We consider a use-case for detecting confusion, representing levels of distraction, during online course delivery, which has become pervasive during the novel coronavirus (COVID-19) pandemic. We apply a unique feature selection technique to determine which features are triggered by confusion, with delta waves, attention, and theta waves emerging as the three most important features. Among the various models investigated, our customized random forest model demonstrated the highest accuracy of 90%. Since the dataset size might have limited the performance of the deep learning-based approaches, we further apply a deep convolutional generative adversarial network (DCGAN) to generate synthetic traces from representative samples of the original EEG data, thereby enhancing the variation in the data. While the performances of the deep learning models increase significantly after data augmentation, they still cannot outperform the random forest model. Furthermore, computational complexity analysis is performed for the three best-performing algorithms, and random forest emerges as the most viable model for our envisioned LiHEA.
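
To make the summarized pipeline concrete, the following is a minimal, illustrative sketch of feature-based confusion classification with a random forest, in the spirit of the approach described above. The file name, column names (e.g., delta, theta, attention, and a binary confused label), and hyperparameters are assumptions for illustration only, not the paper's exact configuration.

```python
# Hedged sketch: random forest on per-window EEG band-power features.
# File name and column names below are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("eeg_confusion_features.csv")  # hypothetical dataset file
X = df.drop(columns=["confused"])               # e.g., delta, theta, attention, ...
y = df["confused"]                              # binary confusion label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# A compact forest (shallow trees, modest ensemble size) keeps inference
# cheap enough for resource-constrained headband hardware.
clf = RandomForestClassifier(n_estimators=100, max_depth=8, random_state=42)
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Impurity-based importances indicate which band features drive predictions,
# analogous to the feature ranking reported in the paper.
for name, imp in sorted(zip(X.columns, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

A forest of this size trades a small amount of accuracy for a memory and compute footprint that plausibly fits on-device, which is the central constraint motivating LiHEA.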

Volume 9
Pages 138834-138848
DOI 10.1109/ACCESS.2021.3118971
Language English
Journal IEEE Access