

Listening gives robots human-like touch

(Credit: Getty Images)

Researchers have given robots a sense of touch by “listening” to vibrations, allowing them to identify materials, understand shapes, and recognize objects much like human hands do.

Imagine sitting in a dark movie theater wondering just how much soda is left in your oversized cup. Rather than prying off the lid and looking, you pick up the cup and shake it a bit to hear how much ice is rattling around inside, giving you a decent indication of whether you’ll need to get a free refill.

Setting the drink back down, you wonder absent-mindedly if the armrest is made of real wood. After giving it a few taps and hearing a hollow echo, however, you decide it must be made of plastic.

This ability to interpret the world through acoustic vibrations emanating from an object is something we do without thinking. And it’s an ability that researchers are on the cusp of bringing to robots to augment their rapidly growing set of sensing abilities.

Set to be published at the Conference on Robot Learning (CoRL 2024) being held November 6–9 in Munich, Germany, new research from Duke University details a system dubbed SonicSense that allows robots to interact with their surroundings in ways previously limited to humans.

A robot "hand" with four "fingers," each with a microphone for sensing.
The ability to feel acoustic vibrations through tactile interactions gives this robotic hand a human-like sense of touch to better perceive the world. (Credit: Duke)

“Robots today mostly rely on vision to interpret the world,” explains Jiaxun Liu, lead author of the paper and a first-year PhD student in the laboratory of Boyuan Chen, assistant professor of mechanical engineering and materials science at Duke.

“We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to ‘feel’ and understand the world.”

SonicSense features a robotic hand with four fingers, each equipped with a contact microphone embedded in the fingertip. These sensors detect and record vibrations generated when the robot taps, grasps, or shakes an object. And because the microphones are in contact with the object, the robot can tune out ambient noise.

Based on the interactions and detected signals, SonicSense extracts frequency features and uses its prior knowledge, paired with recent advancements in AI, to determine what material the object is made of and what 3D shape it has. If it’s an object the system has never seen before, it might take 20 different interactions for the system to come to a conclusion. But if it’s an object already in its database, it can correctly identify it in as few as four.
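The frequency-feature step can be sketched in a few lines. This is a minimal illustration of the general idea, not the paper’s pipeline: the sample rate, band count, and simulated “tap” signal are all illustrative assumptions.

```python
import numpy as np

def spectral_features(signal, sample_rate=44100, n_bins=32):
    """Summarize a contact-mic recording as a coarse log-magnitude spectrum."""
    # Window the signal to reduce spectral leakage, then take the real FFT.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Pool FFT magnitudes into n_bins equal-width frequency bands.
    edges = np.linspace(0.0, freqs[-1], n_bins + 1)
    bands = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
             for lo, hi in zip(edges[:-1], edges[1:])]
    return np.log1p(np.array(bands))  # compress dynamic range

# A simulated "tap": a decaying 800 Hz ring, standing in for a real recording.
t = np.linspace(0, 0.1, 4410, endpoint=False)
tap = np.sin(2 * np.pi * 800 * t) * np.exp(-30 * t)
features = spectral_features(tap)
```

A classifier trained on such feature vectors from many known objects could then match a new recording against its database, which is roughly the role AI models play in the system described above.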

“SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects,” says Chen, who also holds appointments in electrical and computer engineering and in computer science. “While vision is essential, sound adds layers of information that can reveal things the eye might miss.”

In the paper and demonstrations, Chen and his laboratory showcase a number of capabilities enabled by SonicSense. By turning or shaking a box filled with dice, it can count the dice inside and determine their shape. By doing the same with a bottle of water, it can tell how much liquid is inside. And by tapping around the outside of an object, much like how humans explore objects in the dark, it can build a 3D reconstruction of the object’s shape and determine what material it’s made from.

While SonicSense is not the first attempt to use this approach, it goes further and performs better than previous work by using four fingers instead of one, contact microphones that tune out ambient noise, and advanced AI techniques. This setup allows the system to identify objects composed of more than one material, objects with complex geometries, and objects with transparent or reflective surfaces that are challenging for vision-based systems.

“While most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment,” says Liu. “It’s difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world.”

These abilities make SonicSense a robust foundation for training robots to perceive objects in dynamic, unstructured environments. So does its cost: the system uses the same contact microphones that musicians use to record sound from guitars, along with 3D printing and other commercially available components, keeping construction costs to just over $200.

Moving forward, the group is working to enhance the system’s ability to interact with multiple objects. By integrating object-tracking algorithms, robots will be able to handle dynamic, cluttered environments — bringing them closer to human-like adaptability in real-world tasks.

Another key development lies in the design of the robot hand itself. “This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch,” Chen says. “We’re excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions.”

Support for the work came from the Army Research Laboratory STRONG program and DARPA’s FoundSci and TIAMAT programs.

Source: Duke University


Better fake muscles give robot fish real kick

Artificial muscles in action under water. (Credit: Gravert et al./Science Advances)

Researchers have developed artificial muscles for robot motion.

Their solution offers several advantages over previous technologies: it can be used wherever robots need to be soft rather than rigid or where they need more sensitivity when interacting with their environment.

Many roboticists dream of building robots that are not just a combination of metal or other hard materials and motors but also softer and more adaptable. Soft robots could interact with their environment in a completely different way; for example, they could cushion impacts the way human limbs do, or grasp an object delicately.

This would also offer benefits in energy consumption: today, robots usually require a lot of energy just to hold a position, whereas soft systems could also store and release energy efficiently. So, what could be more obvious than to take the human muscle as a model and attempt to recreate it?

Biological inspiration

The functioning of artificial muscles is based on biology. Like their natural counterparts, artificial muscles contract in response to an electrical impulse. However, the artificial muscles consist not of cells and fibers but of a pouch filled with a liquid (usually oil), the shell of which is partially covered in electrodes.

When these electrodes receive an electrical voltage, they draw together and push the liquid into the rest of the pouch, which flexes and is thus capable of lifting a weight. A single pouch is analogous to a short bundle of muscle fibers; several of these can be connected to form a complete propulsion element, which is also referred to as an actuator or simply as an artificial muscle.

The idea of developing artificial muscles is not new, but until now, there has been a major obstacle to realizing it: electrostatic actuators worked only with extremely high voltages of around 6,000 to 10,000 volts. This requirement had several ramifications: for instance, the muscles had to be connected to large, heavy voltage amplifiers; they did not work in water; and they weren’t entirely safe for humans.

Robert Katzschmann, a robotics professor at ETH Zurich, together with Stephan-Daniel Gravert, Elia Varini, and other colleagues, has now developed a new solution.

The HALVE of it

Gravert, who works as a scientific assistant in Katzschmann’s lab, has designed a shell for the pouch. The researchers call the new artificial muscles HALVE actuators, where HALVE stands for “hydraulically amplified low-voltage electrostatic”.

“In other actuators, the electrodes are on the outside of the shell. In ours, the shell consists of different layers,” says Gravert.

“We took a high-permittivity ferroelectric material, i.e., one that can store relatively large amounts of electrical energy, and combined it with a layer of electrodes. Next, we coated it with a polymer shell that has excellent mechanical properties and makes the pouch more stable,” Gravert explains.

This meant the researchers could reduce the required voltage, because the much higher permittivity of the ferroelectric material allows large forces despite low voltage. Not only did Gravert and Varini develop the shell for the HALVE actuators together, but they also built the actuators themselves in the lab to use in two robots.
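A back-of-envelope check (my own estimate, not a calculation from the paper) shows why higher permittivity permits lower voltage. The attractive pressure between the electrodes of an electrostatic actuator scales roughly as the Maxwell stress:

```latex
p \approx \frac{1}{2}\,\varepsilon_0 \varepsilon_r \left(\frac{V}{d}\right)^2
\qquad\Longrightarrow\qquad
V \propto \frac{1}{\sqrt{\varepsilon_r}} \;\text{ at fixed } p, d
```

So at a fixed dielectric thickness and target force, a ferroelectric layer with, say, a 50- to 100-fold higher relative permittivity than a plain polymer film would cut the required voltage by roughly a factor of 7 to 10, which is consistent with the drop from the 6,000–10,000 volts of earlier actuators to the 900 volts used here.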

One of these robotic examples is an 11-centimeter-tall gripper (about 4.3 inches tall) with two fingers. Each finger is moved by three series-connected pouches of the HALVE actuator. A small battery-operated power supply provides the robot with 900 volts. Together, the battery and power supply weigh just 15 grams (about 0.5 ounces). The entire gripper, including the power and control electronics, weighs 45 grams (about 1.58 ounces). The gripper can grip a smooth plastic object firmly enough to support its own weight when the object is lifted into the air with a cord.

“This example excellently demonstrates how small, light and efficient the HALVE actuators are. It also means that we’ve taken a huge step closer to our goal of creating integrated muscle-operated systems,” Katzschmann says with satisfaction.

Diving into the future

The second object is a fish-like swimmer, almost 30 centimeters long (about 11.8 inches), that can move smoothly through the water. It consists of a “head” containing the electronics and a flexible “body” to which the HALVE actuators are attached. These actuators move alternately in a rhythm that produces the swimming motion. The autonomous fish can go from a standstill to a speed of three centimeters per second in 14 seconds—and that’s in normal tap water.

This second example is important because it demonstrates another new feature of the HALVE actuators: as the electrodes no longer sit unprotected outside the shell, the artificial muscles are now waterproof and can also be used in conductive liquids.

“The fish illustrates a general advantage of these actuators—the electrodes are protected from the environment and, conversely, the environment is protected from the electrodes. So, you can operate these electrostatic actuators in water or touch them, for example,” Katzschmann explains.

And the layered structure of the pouches has another advantage: the new actuators are much more robust than other artificial muscles.

Ideally, the pouches should be able to achieve a great deal of motion and do it quickly. However, even the smallest production error, such as a speck of dust between the electrodes, can lead to an electrical breakdown—a kind of mini lightning strike.

“When this happened in earlier models, the electrode would burn, creating a hole in the shell. This allowed the liquid to escape and rendered the actuator useless,” Gravert says. This problem is solved in the HALVE actuators because a single hole essentially closes itself due to the protective plastic outer layer. As a result, the pouch usually remains fully functional even after an electrical breakdown.

The two researchers are clearly delighted to have taken the development of artificial muscles a decisive step forward, but they are also realistic.

As Katzschmann says, “Now we have to ready this technology for larger-scale production, and we can’t do that here in the ETH lab. Without giving too much away, I can say that we’re already registering interest from companies that would like to work with us.”

For example, artificial muscles could one day be used in novel robots, prostheses, or wearables; in other words, in technologies that are worn on the human body.

The research appears in Science Advances.

Source: ETH Zurich


Team prints robotic hand with bones, ligaments, tendons

A robotic hand made of varyingly rigid and elastic polymers. (Credit: Thomas Buchner/ETH Zurich)

Researchers have, for the first time, printed a robotic hand with bones, ligaments, and tendons made of different polymers using a new laser scanning technique.

3D printing is advancing rapidly, and the range of usable materials has expanded considerably. While the technology was previously limited to fast-curing plastics, it is now suitable for slow-curing plastics as well. These offer decisive advantages: enhanced elastic properties and greater durability and robustness.

The use of such polymers is made possible by a new technology that allows researchers to 3D print complex, more durable robots from a variety of high-quality materials in one go. This new technology also makes it easy to combine soft, elastic, and rigid materials. The researchers can also use it to create delicate structures and parts with cavities.

“We wouldn’t have been able to make this hand with the fast-curing polyacrylates we’ve been using in 3D printing so far,” says Thomas Buchner, a doctoral student in the group of ETH Zurich robotics professor Robert Katzschmann and first author of the study published in Nature.

“We’re now using slow-curing thiolene polymers. These have very good elastic properties and return to their original state much faster after bending than polyacrylates.”

This makes thiolene polymers ideal for producing the elastic ligaments of the robotic hand. In addition, the stiffness of thiolenes can be fine-tuned very well to meet the requirements of soft robots.

“Robots made of soft materials, such as the hand we developed, have advantages over conventional robots made of metal. Because they’re soft, there is less risk of injury when they work with humans, and they are better suited to handling fragile goods,” Katzschmann says.

3D printers typically produce objects layer by layer: nozzles deposit a given material in viscous form at each point; a UV lamp then cures each layer immediately. Previous methods involved a device that scraped off surface irregularities after each curing step. This works only with fast-curing polyacrylates. Slow-curing polymers such as thiolenes and epoxies would gum up the scraper.

To accommodate the use of slow-curing polymers, the researchers developed 3D printing further by adding a 3D laser scanner that immediately checks each printed layer for any surface irregularities.

“A feedback mechanism compensates for these irregularities when printing the next layer by calculating any necessary adjustments to the amount of material to be printed in real time and with pinpoint accuracy,” says coauthor Wojciech Matusik, a professor at Massachusetts Institute of Technology.

This means that instead of smoothing out uneven layers, the new technology simply takes the unevenness into account when printing the next layer.
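The closed-loop idea behind this compensation can be sketched simply: scan the surface, compare it with the intended height, and add the deviation to the next layer’s deposit. This is a toy illustration of the principle, not Inkbit’s algorithm; the layer thickness, gain, and 1D arrays are illustrative assumptions.

```python
import numpy as np

def next_layer_deposit(target, measured, nominal_layer=0.1, gain=1.0):
    """Per-point deposit for the next layer: the nominal layer thickness
    plus a correction for how far the scanned surface sits below target."""
    error = target - measured              # positive where the print is low
    deposit = nominal_layer + gain * error
    return np.clip(deposit, 0.0, None)     # nozzles can only add material

# Toy example: a 1D surface that printed 0.02 mm low in the middle.
target = np.full(5, 1.0)                          # intended height so far (mm)
measured = np.array([1.0, 1.0, 0.98, 1.0, 1.0])   # laser-scanned heights (mm)
deposit = next_layer_deposit(target, measured)
```

Here the low spot simply receives 0.02 mm of extra material on the next pass, which is the sense in which the unevenness is “taken into account” rather than scraped away.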

Inkbit, an MIT spin-off, was responsible for developing the new printing technology. The ETH Zurich researchers developed several robotic applications and helped optimize the printing technology for use with slow-curing polymers.

At ETH Zurich, Katzschmann’s group will use the technology to explore further possibilities and to design even more sophisticated structures and develop additional applications. Inkbit is planning to use the new technology to offer a 3D printing service to its customers and to sell the new printers.

Source: ETH Zurich