Georgia Tech Researchers Create Wireless Brain-Machine Interface
SEP 28, 2021 3 MIN READ
by Anthony Alford, Development Group Manager at Genesys Cloud Services
Researchers from the Georgia Institute of Technology's Center for Human-Centric Interfaces and Engineering have created Soft Scalp Electronics (SSE), a wearable wireless electroencephalography (EEG) device for reading human brain signals. By processing the EEG data with a neural network, the system allows users wearing the device to control a video game simply by imagining activity.
The research team described their system in a paper published in Advanced Science. Unlike conventional EEG devices, which have rigid electrodes attached to the skin with gel or paste, the SSE uses several microneedle electrodes mounted in a flexible headband, making the device easy to set up and comfortable to wear for long periods. The researchers used flexible substrates for the device’s circuitry, including a wireless interface to a Bluetooth controller. The device captures EEG signals generated by motor imagery (MI), in which the wearer imagines performing some physical activity, such as moving their hands or feet; the signals are processed by a convolutional neural network (CNN) and used to control a virtual reality (VR) video game. According to lead researcher Woon-Hong Yeo:
The major limitation [of SSE] is that we are measuring signals on the skin, through the skull, through the tissues, so I believe we have to continuously improve our device quality to get better signals. And at the same time, we have to also continuously improve our data analysis…to have a better accuracy rate.
The goal of many brain-machine interface (BMI) researchers is to enable disabled users to control devices using only brain signals. Achieving the best signal-to-noise ratio (SNR) for this purpose requires physical implants into the users’ brains; this of course has the drawback of requiring brain surgery, leading many scientists to pursue non-invasive techniques such as EEG. These systems conventionally use rigid electronics, with electrodes attached to the user’s skin with conductive gels and dangling wires leading away, resulting in sensing artifacts due to motion of the electrodes relative to the brain.
By contrast, the SSE sensors use microneedles that penetrate the dry, dead skin cells of the scalp, and a flexible structure that conforms to the wearer’s head, reducing relative motion and achieving a higher SNR than conventional electrodes. The microneedles are grouped into six sets, each covering an area of approximately 6 mm × 6 mm, providing higher spatial resolution than conventional electrodes.
To use the SSE device as a controller, the researchers trained a CNN model to classify the signals recorded by the sensor while users imagined performing an action: opening and closing a hand or moving a foot. The model achieved 93% accuracy on test data, outperforming similar systems described in previous research. The team used the system as a controller for a “rhythm-type video game” where players must perform specified tasks within a time limit to score points; the test users achieved nearly the highest possible score, “only missing a few points per 5 min game session” with minimal mental effort.
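The classification step described above can be sketched in code. The snippet below is a minimal, illustrative pipeline, not the published architecture: the channel count matches the six electrode sets, but the window length, filter sizes, class labels, and (untrained, random) weights are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch of the motor-imagery (MI) classification step.
# The paper uses a trained CNN over multi-channel EEG; every size and
# label below is illustrative, and the weights are random stand-ins.

N_CHANNELS = 6       # the SSE has six microneedle electrode sets
WINDOW = 250         # EEG samples per classification window (assumed)
CLASSES = ["rest", "hand", "foot"]  # imagined-action labels (assumed)

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels):
    """Valid-mode 1D convolution of a (channels, time) signal with
    (n_filters, channels, width) kernels, followed by ReLU.
    Returns a (n_filters, time') feature map."""
    n_filters, _, width = kernels.shape
    t_out = x.shape[1] - width + 1
    out = np.empty((n_filters, t_out))
    for f in range(n_filters):
        for t in range(t_out):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + width])
    return np.maximum(out, 0.0)

def classify(eeg_window, kernels, readout):
    """CNN-style pipeline: conv -> global average pool -> softmax."""
    features = conv1d_relu(eeg_window, kernels).mean(axis=1)
    logits = features @ readout
    p = np.exp(logits - logits.max())   # numerically stable softmax
    return p / p.sum()

# Random weights stand in for the trained model parameters.
kernels = rng.standard_normal((8, N_CHANNELS, 25)) * 0.1
readout = rng.standard_normal((8, len(CLASSES))) * 0.1

window = rng.standard_normal((N_CHANNELS, WINDOW))  # simulated EEG
probs = classify(window, kernels, readout)
print("predicted action:", CLASSES[int(np.argmax(probs))])
```

In the real system, the predicted class would then be mapped to a game input, with a new window classified every fraction of a second.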
In a discussion about the research on Hacker News, one user pointed out the limitations of BMIs, while noting:
Fortunately you don’t need a brain-reading device to produce something useful, just like you don’t need a teraflop computer to go to the moon. I’ve written recently about an EEG helmet that can be used by profoundly disabled folks to navigate a UI, type, and so on, and that doesn’t require a precise signal at all. So I think what you’ll find is while the Musks of the world are chasing a sci-fi dream of what they think the technology ought to be, most of the utility will come out of using what it’s actually capable of in a smart and compassionate way.
In addition to academic researchers, several tech companies are investigating BMI technology, although many are opting for implanted sensors. In 2019, InfoQ covered Facebook’s system that uses implants to decode electrical signals from patients’ brains into a text representation of the speech sounds that the patients hear and speak. In 2020, researchers from Stanford published an article in Nature describing an implant-based BMI that can convert imagined handwriting motion into computer text. Several of the Stanford team members have collaborated with Facebook as well as with Elon Musk’s BMI startup Neuralink.