A team of researchers from Carnegie Mellon University, working with staff at the University of Minnesota, has made a breakthrough that could benefit paralyzed patients and those with movement disorders.
Using a noninvasive (not implanted) brain-computer interface (BCI), engineers have developed the first successful mind-controlled robotic arm that can continuously track and follow a computer cursor.
BCIs have been shown to achieve good performance for controlling robotic devices using only the signals sensed from brain implants. If robotic devices can be precisely controlled, they could handle a variety of daily tasks for people with disabilities.
Until now, however, BCIs successful in continuously controlling robotic arms have relied on implanted brain devices. These devices require a substantial amount of medical and surgical expertise to correctly install and operate, not to mention the high cost and potential risks to subjects. Thus, their use has been restricted to just a few clinical cases.
A grand challenge in BCI research is to develop less invasive, or even totally noninvasive, ways to let paralyzed patients control their environment or robotic limbs using their own brain activity as the control signal.
However, BCIs that rely on noninvasive external sensing, rather than brain implants, receive noisier signals, which limits decoding precision. As a result, noninvasive BCIs have so far delivered markedly less precise robotic-arm control than implanted devices.
“Advances in neural decoding and the practical use of noninvasive robotic-arm control will have major implications for the eventual development of noninvasive neurorobotics,” notes Bin He, head of the university’s Department of Biomedical Engineering.
Using novel sensing and machine-learning techniques, he and his lab can access signals deep within the brain, giving them fine-grained control over a robotic arm. With noninvasive neuroimaging and a novel continuous-pursuit paradigm, his team is overcoming the noise inherent in EEG signals. This, in turn, is improving EEG-based neural decoding and enabling real-time, continuous 2D control of robotic devices.
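To make the idea of continuous decoding concrete, here is a minimal, hypothetical sketch (not the team's actual pipeline, whose details are not given in this article): a linear ridge-regression decoder that maps simulated EEG-derived features to continuous 2D velocity commands, the kind of continuous-output model used in pursuit-style BCI control. All variable names and dimensions are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: linear decoding of 2D cursor/arm velocity from
# EEG band-power features. Dimensions and noise level are assumptions.

rng = np.random.default_rng(0)

n_samples, n_features = 500, 16            # e.g. 16 channel band-power features
true_W = rng.normal(size=(n_features, 2))  # hidden feature-to-velocity mapping

X = rng.normal(size=(n_samples, n_features))            # simulated EEG features
Y = X @ true_W + 0.1 * rng.normal(size=(n_samples, 2))  # noisy 2D velocities

# Ridge regression: W = (X^T X + lambda*I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# Decode a new feature vector into a continuous 2D velocity command
x_new = rng.normal(size=n_features)
velocity = x_new @ W
print(velocity.shape)  # prints (2,)
```

In a real closed-loop system, a decoder like this would be retrained or adapted across sessions as the user learns, which is one reason user training and engagement matter as much as the decoding algorithm itself.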
For the first time, the team is using a noninvasive BCI to control a robotic arm that tracks a cursor on a computer screen, with the arm following the cursor continuously. In previous attempts, robotic arms under noninvasive human control had followed moving cursors in jerky, discrete motions, as though the arm were trying to “catch up” to the brain’s commands. Now the arm follows the cursor in a smooth, continuous path.
The team also improved the “brain” and “computer” components of BCI by increasing user engagement and training, and raising the spatial resolution of noninvasive neural data through EEG source imaging. These approaches enhanced BCI learning by nearly 60% for traditional center-out tasks, and improved continuous cursor tracking by more than 500%.
The technology also has applications that could help a variety of people by offering safe, noninvasive “mind control” of devices, allowing users to interact with and control their environments. To date, it has been tested in 68 able-bodied human subjects (up to 10 sessions each), including virtual device control and control of a robotic arm for continuous pursuit. The technology is directly applicable to patients, and the team plans to conduct clinical trials in the near future.
“Despite technical challenges using noninvasive signals, we are fully committed to bringing this safe and economic technology to people who can benefit from it,” He said. “This work represents an important step in noninvasive brain-computer interfaces, a technology that someday may become a pervasive assistive technology aiding everyone, like smartphones.”