Laser beams power soft robotic arm for complex tasks

(Credit: Jeff Fitlow/Rice)

Researchers have developed a soft robotic arm capable of performing complex tasks such as navigating around an obstacle or hitting a ball, guided and powered remotely by laser beams without any onboard electronics or wiring.

The research could inform new ways to control implantable surgical devices or industrial machines that need to handle delicate objects.

In a proof-of-concept study that integrates smart materials, machine learning, and an optical control system, a team of Rice University researchers led by materials scientist Hanyu Zhu used a light-patterning device to precisely induce motion in a robotic arm made from azobenzene liquid crystal elastomer—a type of polymer that responds to light.

According to the study in Advanced Intelligent Systems, the new robotic system incorporates a neural network trained to predict the exact light pattern needed to create specific arm movements. This makes it easier for the robot to execute complex tasks without needing similarly complex input from an operator.

“This was the first demonstration of real-time, reconfigurable, automated control over a light-responsive material for a soft robotic arm,” says Elizabeth Blackert, a doctoral alumna who is the first author on the study.

Conventional robots typically involve rigid structures with mobile elements like hinges, wheels, or grippers that enable a predefined, relatively constrained range of motion. Soft robots have opened up new areas of application in contexts like medicine, where safe interaction with delicate objects is required. So-called continuum robots are a type of soft robot that forgoes such mobility constraints, enabling adaptive motion with vastly expanded degrees of freedom.

“A major challenge in using soft materials for robots is they are either tethered or have very simple, predetermined functionality,” says Zhu, assistant professor of materials science and nanoengineering.

“Building remotely and arbitrarily programmable soft robots requires a unique blend of expertise involving materials development, optical system design, and machine learning capabilities. Our research team was uniquely suited to take on this interdisciplinary work.”

The team created a new variation of the elastomer that shrinks under blue laser light, then relaxes and regrows in the dark. This fast relaxation time is what makes real-time control possible. Unlike other light-sensitive materials that require harmful ultraviolet light or take minutes to reset, this one works with safer, longer wavelengths and responds within seconds.

“When we shine a laser on one side of the material, the shrinking causes the material to bend in that direction,” Blackert says. “Our material bends toward laser light like a flower stem does toward sunlight.”

To control the material, the researchers used a spatial light modulator to split a single laser beam into multiple beamlets, each directed to a different part of the robotic arm. The beamlets can be turned on or off and adjusted in intensity, allowing the arm to bend or contract at any given point, much like the tentacles of an octopus. This technique can in principle create a robot with virtually infinite degrees of freedom—far beyond the capabilities of traditional robots with fixed joints.
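To make the beamlet idea concrete, here is a minimal Python sketch, not the researchers’ code, of how per-beamlet intensities could map to local bending along a segmented two-dimensional arm. The segment count and the linear intensity-to-curvature gain are illustrative assumptions, not values from the study.

```python
# Toy model: each beamlet bends one segment of a 2D arm toward the light.
# The intensity-to-curvature gain below is an invented, illustrative value.
import numpy as np

N_SEGMENTS = 10   # arm discretized into segments, one beamlet per segment
SEG_LEN = 1.0     # segment length (arbitrary units)
GAIN = 0.3        # assumed radians of bend per unit laser intensity

def arm_shape(intensities: np.ndarray) -> np.ndarray:
    """Return (x, y) endpoints of each segment for beamlet intensities
    in [0, 1]; bends accumulate along the arm like joint angles."""
    angles = np.cumsum(GAIN * np.clip(intensities, 0.0, 1.0))
    x = np.cumsum(SEG_LEN * np.cos(angles))
    y = np.cumsum(SEG_LEN * np.sin(angles))
    return np.stack([x, y], axis=1)

# Example: illuminate only the midsection, producing a local curl there.
pattern = np.zeros(N_SEGMENTS)
pattern[4:7] = 1.0
print(arm_shape(pattern))
```

In this toy model, turning on only the middle beamlets curls just the midsection, mirroring how the real arm bends wherever the light lands.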

“What is new here is using the light pattern to achieve complex changes in shape,” says Rafael Verduzco, professor and associate chair of chemical and biomolecular engineering and professor of materials science and nanoengineering.

“In prior work, the material itself was patterned or programmed to change shape in one way, but here the material can change in multiple ways, depending on the laser beamlet pattern.”

To train such a multiparameter arm, the team tested a small number of light-setting combinations and recorded how the robotic arm deformed in each case, using the data to train a convolutional neural network, a type of artificial intelligence widely used in image recognition. The trained model could then output the exact light pattern needed to produce a desired shape, such as flexing or a reach-around motion.
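As a rough sketch of the kind of inverse model described, the following PyTorch snippet defines a small convolutional network that maps an image of a desired arm shape to the beamlet intensity pattern expected to produce it. The architecture, image size, and training data here are invented for illustration and are not the published model.

```python
# Hypothetical inverse model: desired shape image -> beamlet intensities.
import torch
import torch.nn as nn

N_BEAMLETS = 10  # assumed number of independently controlled beamlets

class ShapeToPattern(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, N_BEAMLETS),
            nn.Sigmoid(),  # keep predicted intensities in [0, 1]
        )

    def forward(self, shape_img):  # (batch, 1, H, W) image of the arm
        return self.net(shape_img)

# Training outline: pairs of (observed shape image, pattern that caused it).
model = ShapeToPattern()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
shapes = torch.rand(32, 1, 64, 64)     # placeholder camera images
patterns = torch.rand(32, N_BEAMLETS)  # placeholder recorded light settings
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(shapes), patterns)
    loss.backward()
    opt.step()
```

Once trained this way, the network can be run on an image of a shape the arm has never made, yielding a candidate light pattern to try, which is what lets an operator ask for a shape rather than specify each beamlet by hand.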

The current prototype is flat and moves in 2D, but future versions could bend in three dimensions with additional sensors and cameras.

“This is a step towards having safer, more capable robotics for various applications ranging from implantable biomedical devices to industrial robots that handle soft goods,” Blackert says.

Support for the research came from the National Science Foundation, the Welch Foundation, and the US Army Research Office. All opinions expressed in this press release are the authors’ and do not necessarily reflect the policies and views of the funding entities.

Source: Rice University

Watch: Prosthetic robot hand ‘knows’ what it’s touching

(Credit: Johns Hopkins)

Engineers have developed a pioneering prosthetic hand that can grip plush toys, water bottles, and other everyday objects like a human.

The hand carefully conforms and adjusts its grasp to avoid damaging or mishandling whatever it holds.

The system’s hybrid design is a first for robotic hands, which have typically been too rigid or too soft to replicate a human’s touch when handling objects of varying textures and materials.

The innovation offers a promising solution for people with hand loss and could improve how robotic arms interact with their environment.

Details about the device appear in Science Advances.

“The goal from the beginning has been to create a prosthetic hand that we model based on the human hand’s physical and sensing capabilities—a more natural prosthetic that functions and feels like a lost limb,” says Sriramana Sankar, a Johns Hopkins University PhD student in biomedical engineering who led the work.

“We want to give people with upper-limb loss the ability to safely and freely interact with their environment, to feel and hold their loved ones without concern of hurting them.”

The device, developed by the same Neuroengineering and Biomedical Instrumentations Lab that in 2018 created the world’s first electronic “skin” with a humanlike sense of pain, features a multifinger system with rubberlike polymers and a rigid 3D-printed internal skeleton. Its three layers of tactile sensors, inspired by the layers of human skin, allow it to grasp and distinguish objects of various shapes and surface textures, rather than just detect touch. Each of its soft air-filled finger joints can be controlled with the forearm’s muscles, and machine learning algorithms focus the signals from the artificial touch receptors to create a realistic sense of touch, Sankar says.
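As a loose illustration of how readings from stacked tactile layers might be fused to tell objects apart, here is a toy Python sketch using a nearest-centroid rule. The layer summaries, object classes, and data are invented; the published system’s machine learning pipeline is more sophisticated.

```python
# Toy tactile classifier: summarize three sensing layers, then match the
# resulting feature vector to the closest known object centroid.
import numpy as np

def features(layer_top, layer_mid, layer_deep):
    """Summarize each sensing layer by the mean and variance of its taxels."""
    return np.array([np.mean(layer_top), np.var(layer_top),
                     np.mean(layer_mid), np.var(layer_mid),
                     np.mean(layer_deep), np.var(layer_deep)])

# Invented "training" touches: a soft sponge vs. a rigid bottle.
rng = np.random.default_rng(0)
sponge = np.mean([features(*(0.2 + 0.1 * rng.random((3, 16))))
                  for _ in range(20)], axis=0)
bottle = np.mean([features(*(0.8 + 0.05 * rng.random((3, 16))))
                  for _ in range(20)], axis=0)
centroids = {"sponge": sponge, "bottle": bottle}

def classify(touch):
    f = features(*touch)  # touch: (3, 16) array, one row per sensing layer
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

print(classify(0.8 + 0.05 * rng.random((3, 16))))  # -> "bottle"
```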

“The sensory information from its fingers is translated into the language of nerves to provide naturalistic sensory feedback through electrical nerve stimulation,” Sankar says.
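A common way to produce such nerve-like feedback is rate coding, in which a stronger touch yields a higher stimulation pulse frequency. The sketch below illustrates that general idea with an assumed linear pressure-to-frequency mapping; it is not the encoding model used in the study.

```python
# Rate-coding sketch: map normalized pressure samples to stimulation pulse
# times. The frequency range below is an assumption, not a published value.
import numpy as np

F_MIN, F_MAX = 5.0, 100.0  # assumed stimulation frequency range (Hz)

def pressure_to_pulse_times(pressure: np.ndarray, dt: float = 0.001) -> list:
    """Harder touch -> higher pulse rate. Emits a pulse time each time the
    integrated rate completes one cycle."""
    rate = F_MIN + (F_MAX - F_MIN) * np.clip(pressure, 0.0, 1.0)
    phase, times = 0.0, []
    for i, r in enumerate(rate):
        phase += r * dt        # integrate pulses-per-second over the step
        if phase >= 1.0:       # one full cycle completed: emit a pulse
            times.append(i * dt)
            phase -= 1.0
    return times

# Example: a touch that ramps up and then releases over one second.
t = np.linspace(0, 1, 1000)
pressure = np.sin(np.pi * t)
print(len(pressure_to_pulse_times(pressure)), "pulses in 1 s")
```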

In the lab, the hand identified and manipulated 15 everyday objects, including delicate stuffed toys, dish sponges, and cardboard boxes, as well as pineapples, metal water bottles, and other sturdier items. In the experiments, the device outperformed the alternative designs tested, successfully handling objects with 99.69% accuracy and adjusting its grip as needed to prevent mishaps. In the most striking demonstration, it nimbly picked up a thin, fragile plastic cup filled with water using only three fingers, without denting it.

“We’re combining the strengths of both rigid and soft robotics to mimic the human hand,” Sankar says. “The human hand isn’t completely rigid or purely soft—it’s a hybrid system, with bones, soft joints, and tissue working together. That’s what we want our prosthetic hand to achieve. This is new territory for robotics and prosthetics, which haven’t fully embraced this hybrid technology before. It’s being able to give a firm handshake or pick up a soft object without fear of crushing it.”

To help amputees regain the ability to feel objects while grasping, prostheses will need three key components: sensors to detect the environment, a system to translate that data into nerve-like signals, and a way to stimulate nerves so the person can feel the sensation, says Nitish Thakor, a Johns Hopkins biomedical engineering professor who directed the work.

The bioinspired technology allows the hand to function this way, using muscle signals from the forearm, like most hand prostheses. These signals bridge the brain and nerves, allowing the hand to flex, release, or react based on its sense of touch. The result is a robotic hand that intuitively “knows” what it’s touching, much like the nervous system does, Thakor says.
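A minimal closed-loop sketch of that pipeline might look like the following, where a normalized EMG level signals the intent to grasp and tactile feedback trims grip force toward a target. All signal names and thresholds are hypothetical.

```python
# Hypothetical control tick: EMG decides intent; touch feedback tunes grip.
import numpy as np

EMG_CLOSE_THRESHOLD = 0.5  # assumed normalized EMG level that triggers a grasp
TARGET_FORCE = 0.8         # assumed desired grip force (normalized)

def control_step(emg: float, measured_force: float, grip: float) -> float:
    """Return the next grip command given EMG intent and tactile feedback."""
    if emg < EMG_CLOSE_THRESHOLD:
        return max(grip - 0.05, 0.0)          # no grasp intent: relax
    error = TARGET_FORCE - measured_force      # too loose (+) or too tight (-)
    return float(np.clip(grip + 0.1 * error, 0.0, 1.0))

# Example: the user closes the hand, the grip settles, then the user lets go.
grip = 0.0
for emg, force in [(0.9, 0.0), (0.9, 0.3), (0.9, 0.7), (0.2, 0.8)]:
    grip = control_step(emg, force, grip)
    print(f"grip command: {grip:.2f}")
```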

“If you’re holding a cup of coffee, how do you know you’re about to drop it? Your palm and fingertips send signals to your brain that the cup is slipping,” Thakor says.

“Our system is neurally inspired: it models the hand’s touch receptors to produce nervelike messages so the prosthetic’s ‘brain,’ or its computer, understands if something is hot or cold, soft or hard, or slipping from the grip.”
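As a toy illustration of the slip cue Thakor describes, the snippet below flags a grasp as slipping when the measured grip force drops faster than a threshold rate. The threshold and the signals are arbitrary stand-ins for the real receptor model.

```python
# Slip detection sketch: a rapid drop in contact force flags a slipping
# grasp so the controller can tighten its grip. Threshold is invented.
import numpy as np

def detect_slip(force: np.ndarray, dt: float, drop_rate: float = 2.0) -> bool:
    """Flag slip when force falls faster than `drop_rate` units per second."""
    dfdt = np.diff(force) / dt
    return bool(np.any(dfdt < -drop_rate))

# Example: a stable grasp vs. an object sliding out of the fingers.
t = np.linspace(0, 1, 200)
stable = np.ones_like(t)
slipping = np.where(t < 0.5, 1.0, 1.0 - 4.0 * (t - 0.5))
print(detect_slip(stable, dt=t[1] - t[0]))    # False
print(detect_slip(slipping, dt=t[1] - t[0]))  # True
```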

While the research is an early breakthrough for hybrid robotic technology that could transform both prosthetics and robotics, more work is needed to refine the system, Thakor says. Future improvements could include stronger grip forces, additional sensors, and industrial-grade materials.

“This hybrid dexterity isn’t just essential for next-generation prostheses,” Thakor says.

“It’s what the robotic hands of the future need because they won’t just be handling large, heavy objects. They’ll need to work with delicate materials such as glass, fabric, or soft toys. That’s why a hybrid robot, designed like the human hand, is so valuable—it combines soft and rigid structures, just like our skin, tissue, and bones.”

Additional authors are from Florida Atlantic University, Johns Hopkins, and the University of Illinois Chicago.

Funding for the research came from the Department of Defense through the Orthotics and Prosthetics Outcomes Research Program and the National Science Foundation.

Source: Johns Hopkins University
