Researchers are using AI to fine-tune a robotic prosthesis, improving manual dexterity by finding the right balance between human and machine control.
Whether you’re reaching for a mug, a pencil, or someone’s hand, you don’t need to consciously instruct each of your fingers on where they need to go to get a proper grip.
The loss of that intrinsic ability is one of the many challenges people with prosthetic arms and hands face. Even with the most advanced robotic prostheses, these everyday activities come with an added cognitive burden as users purposefully open and close their fingers around a target.
Researchers at the University of Utah are now using artificial intelligence to solve this problem. By integrating proximity and pressure sensors into a commercial bionic hand and then training an artificial neural network on grasping postures, the researchers developed an autonomous approach that is more like the natural, intuitive way we grip objects. When working in tandem with the artificial intelligence, study participants demonstrated greater grip security, greater grip precision, and less mental effort.
Critically, the participants were able to perform numerous everyday tasks, such as picking up small objects and raising a cup, using different gripping styles, all without extensive training or practice.
The study was led by engineering professor Jacob A. George and Marshall Trout, a postdoctoral researcher in the Utah NeuroRobotics Lab, and appears in the journal Nature Communications.
“As lifelike as bionic arms are becoming, controlling them is still not easy or intuitive,” Trout says. “Nearly half of all users will abandon their prosthesis, often citing their poor controls and cognitive burden.”
One problem is that most commercial bionic arms and hands have no way of replicating the sense of touch that normally gives us intuitive, reflexive ways of grasping objects. Dexterity is not simply a matter of sensory feedback, however. We also have subconscious models in our brains that simulate and anticipate hand-object interactions; a “smart” hand would also need to learn these automatic responses over time.
The Utah researchers addressed the first problem by outfitting an artificial hand, manufactured by TASKA Prosthetics, with custom fingertips. In addition to detecting pressure, these fingertips were equipped with optical proximity sensors sensitive enough to replicate even the finest sense of touch: the fingers could detect an effectively weightless cotton ball being dropped on them.
For the second problem, they trained an artificial neural network on the proximity data so that the fingers automatically move to exactly the distance needed to form a secure grasp. Because each finger has its own sensor and can “see” what is in front of it, the digits work in parallel to conform to an object of any shape.
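The per-finger behavior described above can be sketched as a simple proportional closing law. This is a minimal illustration only: the sensor readings, target distance, and gain below are invented for the example and are not the study’s actual controller, which used a trained neural network.

```python
def finger_velocity(proximity_mm: float, target_mm: float = 2.0,
                    gain: float = 0.1, max_speed: float = 1.0) -> float:
    """Close a digit faster when the object is far, slowing as it nears.

    proximity_mm: distance from the fingertip sensor to the object surface.
    Returns a normalized closing speed in [0, max_speed]; 0 means hold.
    """
    error = proximity_mm - target_mm
    if error <= 0:            # fingertip already at the target distance
        return 0.0
    return min(max_speed, gain * error)

# Each digit runs the same law independently on its own sensor reading,
# so the fingers close in parallel and stop at the object's surface.
readings = {"thumb": 12.0, "index": 4.0, "middle": 1.5}
speeds = {name: finger_velocity(d) for name, d in readings.items()}
```

Because each finger stops on its own local reading rather than on a single hand-wide command, the grasp conforms to irregular shapes without the user steering individual digits.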
But one problem still remained. What if the user didn’t intend to grasp the object in that exact manner? What if, for example, they wanted to open their hand to drop the object? To address this final piece of the puzzle, the researchers created a bioinspired approach that involves sharing control between the user and the AI agent. The success of the approach relied on finding the right balance between human and machine control.
“What we don’t want is the user fighting the machine for control. In contrast, here the machine improved the precision of the user while also making the tasks easier,” Trout says. “In essence, the machine augmented their natural control so that they could complete tasks without having to think about them.”
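One way to picture shared control of the kind described here is a weighted blend of the user’s command and the AI’s command, with a strong user signal overriding the machine. The weighting rule, threshold, and numbers below are illustrative assumptions, not the published controller.

```python
def shared_command(user_cmd: float, ai_cmd: float,
                   user_weight: float = 0.3,
                   override_threshold: float = 0.8) -> float:
    """Return a blended hand command in [-1, 1]; negative opens the hand.

    A strong, deliberate user signal overrides the AI entirely, so the
    user can always open the hand to drop an object the AI would hold.
    """
    if abs(user_cmd) >= override_threshold:
        return user_cmd   # clear intent: the user wins
    return user_weight * user_cmd + (1 - user_weight) * ai_cmd

# AI wants to keep closing (0.6), but the user issues a strong open (-0.9):
drop = shared_command(-0.9, 0.6)
# With only a weak user signal (0.1), the blend mostly follows the AI:
hold = shared_command(0.1, 0.6)
```

The design goal such a blend captures is exactly the one Trout describes: the machine steadies and augments the user’s command in routine moments, while unambiguous user intent is never fought.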
The researchers also conducted studies with four participants whose amputations fell between the elbow and wrist. In addition to showing improved performance on standardized tasks, the participants attempted multiple everyday activities that required fine motor control. Simple tasks, like drinking from a plastic cup, can be incredibly difficult for an amputee: squeeze too softly and you’ll drop it, but squeeze too hard and you’ll break it.
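The plastic-cup problem amounts to keeping fingertip force inside a safe band between slipping and crushing. A minimal sketch of that idea, with made-up force thresholds and step size rather than the study’s actual values:

```python
def adjust_grip(force_n: float, slip_n: float = 1.0, crush_n: float = 4.0,
                step_n: float = 0.25) -> float:
    """Nudge the measured fingertip force (newtons) back into the safe band."""
    if force_n < slip_n:
        return force_n + step_n   # tighten before the cup slips
    if force_n > crush_n:
        return force_n - step_n   # relax before the cup cracks
    return force_n                # already in the safe band: hold
```

Running a rule like this on the prosthesis itself, using its pressure sensors, is what lets the grip-force regulation happen without the user’s conscious attention.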
“By adding some artificial intelligence, we were able to offload this aspect of grasping to the prosthesis itself,” George says. “The end result is more intuitive and more dexterous control, which allows simple tasks to be simple again.”
“The study team is also exploring implanted neural interfaces that allow individuals to control prostheses with their mind and even regain a sense of touch,” George says. “Next, the team plans to blend these technologies, so that the enhanced sensors can improve tactile function and the intelligent prosthesis can work seamlessly with thought-based control.”
Additional coauthors are from the University of Utah and the University of Colorado, Boulder.
Funding came from the National Institutes of Health and the National Science Foundation.
Source: University of Utah