Twelve years ago, a car wreck took away Nathan Copeland’s ability to control his hands or sense what his fingers were touching.
A few months ago, researchers at the University of Pittsburgh and the University of Pittsburgh Medical Center gave Copeland a new way to reach out and feel the world around him. It’s a mind-controlled robotic arm that has pressure sensors in each fingertip that send signals directly to Copeland’s brain.
The scientists published details of their work online Thursday in the journal Science Translational Medicine.
“It’s a really weird sensation,” Copeland, now 30, says in a video made shortly after he first tried the system. “Sometimes it feels, kind of, like electrical and sometimes it’s more of a pressure.” But he also describes many of the sensations coming from his robotic hand as “natural.”
When Copeland touches an object with the robotic hand, he can tell which finger the sensation is coming from and whether an object feels hard or soft, says Robert Gaunt, a bioengineer and assistant professor in the Department of Physical Medicine & Rehabilitation at the University of Pittsburgh.
“But we’re really not at the point where we could, say, get him to feel the difference between silk and burlap,” Gaunt says.
The success represents an advance that is “absolutely critical in terms of making prosthetics useful,” says Mike McLoughlin, an engineer at the Johns Hopkins University Applied Physics Laboratory.
McLoughlin is part of a team at Hopkins that developed the Modular Prosthetic Limb that Copeland is using. The research at both Hopkins and in Pittsburgh is supported by the government’s Defense Advanced Research Projects Agency.
For several years now, people have been able to control robotic arms using thoughts alone. But they have relied entirely on vision to know whether the arm is going in the right direction or grasping an object with the proper amount of force.
That makes it very challenging to perform simple tasks like grasping a foam coffee cup without crushing it, McLoughlin says.
“Without sensory feedback, somebody would actually have to look at the prosthetic, look at the cup, start to close the hand, (and) visually see the cup is starting to deform,” he says.
Restoring Copeland’s sense of touch was a painstaking process. But the Pittsburgh team knew it was possible.
“His hand has been disconnected from his brain because of his spinal cord injury,” Gaunt says. “But the brain hasn’t lost its ability to feel.”
So the team began looking for a way to send touch sensations directly to Copeland’s brain. The first step was to monitor his brain activity using a technique called magnetoencephalography.
“We were able to see the parts of his brain that became active when he was watching videos of a hand being touched,” Gaunt says.
Next, the researchers placed tiny electrodes in Copeland’s brain that could stimulate the areas corresponding to each finger. Then they waited for the brain to heal, as it adjusted to the presence of the electrodes.
It was several weeks before the team was able to send the first tiny pulse of electricity to Copeland’s brain. “When it finally happened, he just very calmly said, ‘Yep, I felt it on my index finger,’ ” Gaunt recalls. “But in the background I was breathing a sigh of relief and other people were cheering.”
Of course, mind-controlled robots are still years away from consumer applications, McLoughlin says. At the moment, they are still too expensive, too bulky and too finicky to be used outside a laboratory setting. And there’s no good way to control them without implanting electrodes in the brain.
Still, the ability to receive touch sensation from a robotic arm has the potential to help not only thousands of people who are paralyzed, but also people with a wide range of physical disabilities, McLoughlin says. For example, robots that provide sensory feedback could eventually help a disabled person cook a meal or tidy up at home.
“We’re on the verge of something here that’s going to transform lives,” he says.