Machine learning, applied to the interpretation of brain wave patterns, could result in better brain-machine interfaces.

© Pixabay

Decoding brain waves with machine learning

16 Oct 2019

A machine learning technique developed by A*STAR scientists could help patients with locked-in syndrome communicate via brain-machine interfaces.

The idea of connecting the human brain to a computer may sound like the stuff of science fiction, but the technology to enable it is already being developed in laboratories today. If successfully deployed, brain-computer interfaces (BCIs) could restore quality of life for patients living with conditions such as amyotrophic lateral sclerosis (ALS).

A neurodegenerative disease, ALS robs patients of their ability to move or speak, even as full cognition and consciousness are retained. At the Institute for Infocomm Research (I2R), A*STAR, scientists are working towards creating BCIs that could help ALS patients communicate with their loved ones using their brain waves alone. The research is carried out in collaboration with the National Neuroscience Institute.

“The effectiveness of control and rehabilitation for patients who have lost motor function relies on the classification accuracy of motor imagery,” said Tao Yang, Research Scientist at I2R, explaining that motor imagery refers to brain wave patterns associated with movements the patient imagines. For instance, the intention to move to the right generates brain wave patterns distinct from those representing a movement to the left. A computer must be able to distinguish between the two patterns and execute the intended movement faithfully.

“Poor classification accuracy leads to incorrect feedback to the users, limiting the application of motor imagery-based BCIs,” Yang said.

The group hypothesized that machine learning could be used to better interpret motor imagery data. They developed a convolutional neural network, a class of machine learning model loosely inspired by how the brain processes visual information, and applied it to recordings of the brain wave patterns of a patient with locked-in syndrome, a condition that can result from ALS. The recordings were taken during an activity that required the patient to mentally move a cursor in a specified direction.
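
For readers who want a sense of what such a model looks like in practice, the minimal PyTorch sketch below classifies single-trial EEG epochs that have been rendered as two-dimensional time-frequency images. The layer sizes, the 64x64 input and the two-class output are illustrative assumptions for this example, not the architecture published by the team.

```python
# Minimal sketch (not the authors' published architecture): a small CNN that
# classifies single-trial EEG epochs rendered as 2D time-frequency "images",
# e.g. left- vs right-cursor motor imagery. Shapes and layer sizes are
# illustrative assumptions only.
import torch
import torch.nn as nn

class MotorImageryCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 input channel: the EEG "image"
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),                # fixed-size output regardless of input size
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: a batch of 8 hypothetical 64x64 time-frequency images.
model = MotorImageryCNN()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 2]) -> scores for the two imagined directions
```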

“Using our technique, we discovered that brain wave patterns occurring at 4 Hz and lower—known as the Delta band in electroencephalography—enhanced the classification accuracy of motor imagery analysis to above 60 percent on average,” said Yang. Delta band signals are commonly overlooked in motor imagery analysis owing to their association with sleep, he added.
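
As an illustration of what isolating the delta band involves, the short SciPy sketch below low-pass filters a synthetic EEG trace at 4 Hz. The 250 Hz sampling rate and the synthetic signal are assumptions made for the example, not details drawn from the study.

```python
# Sketch of isolating the delta band (<= 4 Hz) from a raw EEG channel before
# classification. The 250 Hz sampling rate and the synthetic signal are
# assumptions for illustration only.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                   # assumed EEG sampling rate in Hz
t = np.arange(0, 4, 1 / fs)  # 4 seconds of data
# Synthetic channel: a 2 Hz delta component mixed with a 10 Hz alpha rhythm plus noise
eeg = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(t.size)

# 4th-order Butterworth low-pass at 4 Hz, applied forwards and backwards
# (filtfilt) so the filter adds no phase delay to the signal.
b, a = butter(N=4, Wn=4.0, btype="low", fs=fs)
delta = filtfilt(b, a, eeg)
print(delta.shape)  # same length as the input; ready to be turned into an image or feature
```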

While these initial results are promising, Yang acknowledges that much more work is needed before their method can be applied to motor imagery-based BCIs. “More data sets are required to test the proposed convolutional neural network architecture,” he pointed out. “Combining the image-based representation approach with a long short-term memory (LSTM) technique would also allow us to better capture the spatial and temporal aspects of electroencephalography signals.”
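
The combination Yang describes can be pictured as a CNN that summarizes the spatial pattern in each short window of the EEG image, feeding an LSTM that tracks how those patterns evolve over time. The sketch below shows one plausible arrangement; the window count, feature sizes and layer choices are illustrative assumptions, not the team's proposed design.

```python
# Hedged sketch of a CNN + LSTM hybrid: the CNN extracts a spatial feature
# vector from each short EEG "image" window, and the LSTM reads the sequence
# of those vectors to model how the pattern evolves over time.
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    def __init__(self, n_classes: int = 2, feat_dim: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(                             # spatial feature extractor per window
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((2, 2)), nn.Flatten(),
            nn.Linear(8 * 2 * 2, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, 32, batch_first=True)   # temporal model across windows
        self.head = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, windows, 1, height, width) -> run the CNN on every window
        b, w = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, w, -1)
        _, (h, _) = self.lstm(feats)                           # keep the final hidden state
        return self.head(h[-1])

# Example: 4 trials, each split into 10 consecutive 32x32 image windows.
model = CNNLSTMClassifier()
print(model(torch.randn(4, 10, 1, 32, 32)).shape)  # torch.Size([4, 2])
```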

The A*STAR-affiliated researchers contributing to this research are from the Institute for Infocomm Research (I2R).


References

Yang, T., Selvaratnam, T., Ang, K.K., Phua, K.S., Toh, V., et al. Image-based Motor Imagery EEG Classification using Convolutional Neural Network. IEEE EMBS International Conference on Biomedical & Health Informatics (2019).

About the Researcher

Tao Yang

Research Scientist

Institute for Infocomm Research

Tao Yang received his PhD degree in robotics from the National University of Singapore. He is currently a Research Scientist at A*STAR’s Institute for Infocomm Research (I2R). His research interests include artificial intelligence, robotics in healthcare applications and brain-computer interfaces.

This article was made for A*STAR Research by Wildtype Media Group