From devices that help stroke victims regain the use of their limbs to novelty cat ears you move with your mind, the promise and wonder of brain-computer interface (BCI) technology are quickly becoming a reality. Despite this rapid progress, however, the performance of BCI technology remains limited by the large amounts of high-quality brain wave data needed to train classification algorithms, which decode brain signals into a format that computers can use.
One issue is that tasks employed to collect brain wave data—typically measured using electroencephalography (EEG)—are often unrealistic. “Many BCI experiments rely on controlled conditions in which subjects are instructed to fully focus on the main task,” said Kai Keng Ang, a Senior Scientist at A*STAR’s Institute for Infocomm Research (I2R). “However, this is different from what normally happens in real-life situations where various internal and external factors can make it difficult to stay focused on the task.”
EEG data also vary from subject to subject and session to session, making it impractical to measure enough high-quality data from human subjects and difficult to generate artificial data using conventional models, Ang added.
In a new study, Ang collaborated with corresponding author Cuntai Guan of Nanyang Technological University to address these issues, designing a new framework to generate artificial EEG data that can augment real training data for classification. Based on a type of neural network called a deep convolutional generative adversarial network (DCGAN), the framework is trained on EEG data recorded from subjects performing a movement-intention task, either while fully focused (to simulate controlled conditions) or while distracted (to resemble real-life scenarios).
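The adversarial setup behind a DCGAN can be illustrated with a minimal sketch, assuming PyTorch: a generator upsamples random noise into a multi-channel, EEG-shaped signal, while a discriminator learns to tell generated trials from measured ones. The channel count, trial length, and layer sizes below are illustrative placeholders, not the architecture reported in the study.

```python
# Minimal DCGAN sketch for artificial EEG trials (illustrative only).
import torch
import torch.nn as nn

N_CHANNELS = 8    # hypothetical number of EEG electrodes
TRIAL_LEN = 128   # hypothetical time samples per trial
LATENT_DIM = 64   # dimensionality of the generator's noise input

class Generator(nn.Module):
    """Upsamples a noise vector into a multi-channel EEG-like trial."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose1d(LATENT_DIM, 32, kernel_size=16),             # length 1 -> 16
            nn.BatchNorm1d(32), nn.ReLU(),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=4),            # 16 -> 64
            nn.BatchNorm1d(16), nn.ReLU(),
            nn.ConvTranspose1d(16, N_CHANNELS, kernel_size=4, stride=2,
                               padding=1),                                  # 64 -> 128
            nn.Tanh(),  # assumes trials are rescaled to [-1, 1]
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores a trial as real (close to 1) or generated (close to 0)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=4, stride=2, padding=1),  # 128 -> 64
            nn.LeakyReLU(0.2),
            nn.Conv1d(16, 32, kernel_size=4, stride=4),                     # 64 -> 16
            nn.LeakyReLU(0.2),
            nn.Conv1d(32, 1, kernel_size=16),                               # 16 -> 1
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).view(-1, 1)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

# One adversarial step on random stand-ins for measured EEG trials.
real = torch.randn(8, N_CHANNELS, TRIAL_LEN)

# Discriminator step: real trials labelled 1, generated trials labelled 0.
fake = G(torch.randn(8, LATENT_DIM, 1)).detach()
d_loss = bce(D(real), torch.ones(8, 1)) + bce(D(fake), torch.zeros(8, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: push D to label generated trials as real.
g_loss = bce(D(G(torch.randn(8, LATENT_DIM, 1))), torch.ones(8, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Once trained, artificial trials come straight from the generator.
artificial = G(torch.randn(5, LATENT_DIM, 1)).detach()
```

In a real pipeline the two steps above would alternate over many epochs of measured trials; here a single step only shows the mechanics.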
In addition to real training data, the DCGAN-based framework also learns from subject-specific variables, which enables subject-specific artificial EEG data to be generated. “This will significantly reduce the calibration time when tailoring a BCI system to a new user,” Ang explained.
The researchers, including study first author Fatemeh Fahimi, who was previously a postdoctoral researcher at I2R, also generated artificial EEG data using two benchmark methods for comparison. Compared to real EEG data alone, the artificially augmented EEG data produced more accurate classification results, especially under the real-life, distracted scenario.
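The augmentation step itself is simple to picture: generated trials are pooled with the measured ones before the classifier is trained. The sketch below, using NumPy only, shows that mechanic with random stand-in data and a toy nearest-centroid classifier; the array sizes and the classifier are illustrative assumptions, not the pipeline used in the study.

```python
# Augmenting real EEG trials with artificial ones (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n_channels, trial_len = 8, 128

# Stand-ins for measured trials with binary movement-intention labels.
real_x = rng.normal(size=(40, n_channels, trial_len))
real_y = rng.integers(0, 2, size=40)

# Stand-ins for generator output, labelled by the generating condition.
fake_x = rng.normal(size=(60, n_channels, trial_len))
fake_y = rng.integers(0, 2, size=60)

# Augmentation: concatenate real and artificial training sets.
train_x = np.concatenate([real_x, fake_x])
train_y = np.concatenate([real_y, fake_y])

# Toy classifier: flatten each trial and assign test trials to the
# nearest class centroid of the augmented training set.
flat = train_x.reshape(len(train_x), -1)
centroids = np.stack([flat[train_y == c].mean(axis=0) for c in (0, 1)])

def classify(trials):
    f = trials.reshape(len(trials), -1)
    dists = np.linalg.norm(f[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

preds = classify(real_x)  # one predicted label per real trial
```

In the study the classifier is of course far more sophisticated, but the comparison reported above amounts to training once on `real_x` alone and once on the concatenated `train_x`, then measuring accuracy on held-out trials.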
“The improvement in accuracy suggests that with effective artificial EEG data generation, we can achieve high performance without undergoing a long calibration session to obtain more EEG data,” Ang concluded.
The A*STAR-affiliated researchers contributing to this research are from the Institute for Infocomm Research (I2R).