Plug-and-play brain–computer interface

Brain–computer interfaces (BCIs) offer the potential for people with severe motor disabilities to control external assistive devices with their mind. Current BCI systems are limited, however, by the need for daily recalibration of the decoder that converts neural activity into control signals. Researchers at the UC San Francisco Weill Institute for Neurosciences have now employed machine learning to help an individual with tetraplegia learn to control a computer cursor using their brain activity, without requiring extensive daily retraining.

To record neural activity, the team used a 128-channel electrocorticography (ECoG) implant, a pad of electrodes that’s surgically placed on the surface of the brain. Already approved for seizure monitoring in epilepsy patients, ECoG arrays provide greater long-term stability than the pincushion-style arrays of sharp electrodes used in previous BCI studies, which are more sensitive but tend to move or lose signal over time.


The researchers developed a BCI algorithm that uses machine learning (based on an adaptive Kalman filter) to map the neural activity recorded by the ECoG electrodes to the user’s desired cursor movements. Over roughly six months, beginning two months after ECoG implantation, they conducted experiments assessing two approaches for adapting this decoder as the user learned control.
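To make the idea concrete, here is a minimal sketch of a velocity Kalman-filter decoder with a simple online adaptation rule. This is an illustration only, not the authors' actual algorithm: the state is assumed to be 2-D cursor velocity, the observations are per-channel neural features, and the observation matrix is nudged with a gradient step whenever the intended velocity is known (as during calibration trials). All parameter values are hypothetical.

```python
import numpy as np

class AdaptiveKalmanDecoder:
    """Illustrative adaptive Kalman-filter cursor decoder (hypothetical).

    State x: 2-D cursor velocity. Observation y: neural features, one per
    ECoG channel. The observation matrix C is adapted online with a small
    gradient step, a stand-in for closed-loop decoder adaptation.
    """

    def __init__(self, n_features, lr=1e-2, seed=0):
        rng = np.random.default_rng(seed)
        self.A = 0.9 * np.eye(2)                 # velocity smoothness prior
        self.W = 0.01 * np.eye(2)                # process noise covariance
        self.C = 0.1 * rng.standard_normal((n_features, 2))  # neural tuning (learned)
        self.Q = np.eye(n_features)              # observation noise covariance
        self.P = np.eye(2)                       # state estimate covariance
        self.x = np.zeros(2)                     # current velocity estimate
        self.lr = lr                             # adaptation step size

    def step(self, y, target_vel=None):
        # Predict: propagate the velocity estimate through the state model.
        x_pred = self.A @ self.x
        P_pred = self.A @ self.P @ self.A.T + self.W
        # Update: correct the prediction with the neural observation y.
        S = self.C @ P_pred @ self.C.T + self.Q
        K = P_pred @ self.C.T @ np.linalg.inv(S)  # Kalman gain
        self.x = x_pred + K @ (y - self.C @ x_pred)
        self.P = (np.eye(2) - K @ self.C) @ P_pred
        # Adapt: when the intended velocity is known (e.g. a cued target),
        # move C toward explaining the observed neural features.
        if target_vel is not None:
            err = y - self.C @ target_vel
            self.C += self.lr * np.outer(err, target_vel)
        return self.x
```

In this sketch the filter itself stays fixed while only the neural-to-velocity mapping adapts, which is one way a decoder can track slow changes in recorded activity without a full daily recalibration.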


FiveWordsForTheFuture - Oct 3, 2020 | Biodesign, Body, Health, Interfaces, Medicine, Wearables