
Brain cap morphs thought into motion

U. MARYLAND (US) — Interface technology that connects mind to machine could soon be used to control computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars.

“We are on track to develop, test and make available to the public—within the next few years—a safe, reliable, noninvasive brain-computer interface that can bring life-changing technology to millions of people whose ability to move has been diminished due to paralysis, stroke, or other injury or illness,” says José “Pepe” Contreras-Vidal, associate professor of kinesiology at the University of Maryland.



“We are doing something that few previously thought was possible,” says Contreras-Vidal. “We use EEG (electroencephalography) to non-invasively read brain waves and translate them into movement commands for computers and other devices.”

For the study, reported in the Journal of Neurophysiology, Contreras-Vidal successfully used EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee, and hip joints during human treadmill walking. Two earlier studies showed (1) similar results for 3-D hand movement and (2) that subjects wearing the brain cap could control a computer cursor with their thoughts.
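The decoding step described above can be illustrated with a toy sketch: such studies typically fit a linear model mapping time-lagged EEG features to joint kinematics and report accuracy as the correlation between decoded and measured trajectories. The dimensions, ridge penalty, and synthetic data below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for preprocessed EEG: n_samples x (channels * time lags).
# 34 channels and 10 lags are hypothetical numbers chosen for illustration.
n_samples, n_features = 2000, 34 * 10
X = rng.standard_normal((n_samples, n_features))

# Hidden linear mapping to 3 joint angles (ankle, knee, hip) plus noise.
true_W = rng.standard_normal((n_features, 3)) * 0.1
Y = X @ true_W + 0.05 * rng.standard_normal((n_samples, 3))

# Fit a ridge-regularized linear decoder via the normal equations.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# Evaluate with a per-joint Pearson correlation, the metric such studies report.
Y_hat = X @ W
for j, name in enumerate(["ankle", "knee", "hip"]):
    r = np.corrcoef(Y[:, j], Y_hat[:, j])[0, 1]
    print(f"{name}: r = {r:.2f}")
```

On this synthetic data the decoder recovers the joint trajectories almost exactly; real EEG decoding of gait yields far lower, but still significant, correlations.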

The researchers write “that EEG signals can be used to study the cortical dynamics of walking and to develop brain-machine interfaces aimed at restoring human gait function.”

There are other brain-computer interface technologies under development, but Contreras-Vidal notes that these competing technologies are either invasive, requiring electrodes to be implanted directly in the brain, or, if noninvasive, require much more training to use.

Contreras-Vidal and his team are collaborating with researchers at other institutions on a rapidly growing array of projects to develop thought-controlled robotic prosthetics that can assist victims of injury and stroke.

Their latest partnership with researchers at Rice University, the University of Michigan, and Drexel University will design a prosthetic arm that amputees can control directly with their brains, and which will allow users to feel what their robotic arm touches.

“There’s nothing fictional about this,” says Marcia O’Malley, associate professor of mechanical engineering at Rice. “The investigators on this grant have already demonstrated that much of this is possible. What remains is to bring all of it—non-invasive neural decoding, direct brain control and (touch) sensory feedback—together into one device.”

In a project now underway, Contreras-Vidal and colleagues are pairing the brain cap’s EEG-based technology with a next-generation robotic arm designed by researchers at the Johns Hopkins Applied Physics Laboratory to function like a normal limb.

Also in development is a collaboration with a New Zealand developer of a powered lower-limb exoskeleton called Rex that could be used to restore gait after spinal cord injury.

Researchers see the brain cap technology being used to help stroke victims whose brain injuries affect their motor-sensory control.

“There is a big push in brain science to understand what exercise does in terms of motor learning or motor retraining of the human brain,” says Larry Forrester, associate professor of physical therapy and rehabilitation science at the University of Maryland.

Forrester tracks the neural activity of people on a treadmill doing precise tasks like stepping over dotted lines and matches specific brain activity recorded in real time with exact lower-limb movements.

The data could help stroke victims in several ways, Forrester says. One is a prosthetic device, called an “anklebot,” or ankle robot, that stores data from a normal human gait and assists partially paralyzed people. People who are less mobile commonly suffer from other health issues such as obesity, diabetes or cardiovascular problems, Forrester says, “so we want to get (stroke survivors) up and moving by whatever means possible.”

The second use of the EEG data in stroke victims is more complex, yet offers exciting possibilities.

“By decoding the motion of a normal gait,” Contreras-Vidal says, “we can then try and teach stroke victims to think in certain ways and match their own EEG signals with the normal signals.” This could “retrain” healthy areas of the brain in what is known as neuroplasticity.
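Matching a patient's EEG against "normal" signals, as described above, amounts to computing a similarity score that can be fed back to the patient in real time. The cosine-similarity function below is a hypothetical illustration of such a feedback score, not the team's actual method.

```python
import numpy as np

def gait_match_score(patient_eeg: np.ndarray, template_eeg: np.ndarray) -> float:
    """Hypothetical feedback score: cosine similarity between a patient's
    EEG feature vector and a 'normal gait' template, mapped to [0, 1]."""
    p = patient_eeg - patient_eeg.mean()
    t = template_eeg - template_eeg.mean()
    cos = float(p @ t / (np.linalg.norm(p) * np.linalg.norm(t)))
    return (cos + 1.0) / 2.0   # 1.0 = perfect match, 0.0 = exact opposite

# Stand-in 'normal gait' template and two synthetic patient signals.
template = np.sin(np.linspace(0, 4 * np.pi, 256))
close = template + 0.1 * np.random.default_rng(1).standard_normal(256)
far = np.random.default_rng(2).standard_normal(256)

print(gait_match_score(close, template))  # near 1.0
print(gait_match_score(far, template))    # near 0.5 (uncorrelated)
```

In a training game of the kind Graff envisions, a score like this could drive an on-screen avatar, rewarding the patient as their signals approach the template.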

One potential method for retraining comes from Steve Graff, a first-year bioengineering doctoral student who envisions a virtual reality game that matches real EEG data with on-screen characters.

“It gives us a way to train someone to think the right thoughts to generate movement from digital avatars. If they can do that, then they can generate thoughts to move a device,” says Graff, who brings a unique personal perspective to the work.

He has congenital muscular dystrophy and uses a motorized wheelchair. The advances he’s working on could allow him to use both hands—to put on a jacket, dial his cell phone or throw a football while operating his chair with his mind.

During the past two decades a great deal of progress has been made in the study of direct brain-computer interfaces, most of it through studies using monkeys with electrodes implanted in their brains. However, for use in humans such an invasive approach poses many problems, not the least of which is that most people don’t want holes in their heads and wires attached to their brains, says Contreras-Vidal.

“EEG monitoring of the brain, which has a long, safe history for other applications, has been largely ignored by those working on brain-machine interfaces, because it was thought that the human skull blocked too much of the detailed information on brain activity needed to read thoughts about movement and turn those readings into movement commands for multi-functional high-degree of freedom prosthetics.”

The research is funded in part by the National Science Foundation and National Institutes of Health.
