A software tool makes it easy for anybody to quickly design a custom robot—including its movements—and print out its parts with a 3D printer. You assemble the parts like a puzzle. Add electric motors to the joints, install a control unit and battery, and then unleash your creature.
The first step is to create a basic skeleton for the desired robot, specifying how many extremities the figure will have and how many segments there will be in the backbone. This skeleton can be modified at will by extending or shortening its segments or breaking them up with new joints.
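To make the workflow concrete, here is a minimal sketch of how such an editable skeleton might be represented in code. All names and the data model are hypothetical illustrations, not the actual tool's internals: limbs are chains of segments, with a joint implied between each pair of segments, and segments can be lengthened, shortened, or split to add a joint.

```python
from dataclasses import dataclass

# Hypothetical skeleton representation; illustrative only.

@dataclass
class Segment:
    length: float  # segment length in centimeters

@dataclass
class Limb:
    segments: list  # chain of Segments; a joint sits between neighbors

    def extend(self, index, delta):
        """Lengthen (or shorten, with a negative delta) one segment."""
        self.segments[index].length += delta

    def split(self, index):
        """Break a segment in two, adding a new joint between the halves."""
        old = self.segments[index]
        self.segments[index] = Segment(old.length / 2)
        self.segments.insert(index + 1, Segment(old.length / 2))

@dataclass
class Skeleton:
    spine: Limb   # the backbone, also built from segments
    limbs: list   # one Limb per extremity

# A four-legged figure with a two-segment backbone:
dog = Skeleton(
    spine=Limb([Segment(10.0), Segment(10.0)]),
    limbs=[Limb([Segment(6.0), Segment(6.0)]) for _ in range(4)],
)
dog.limbs[0].split(0)        # add a knee joint to the first leg
dog.limbs[0].extend(0, 1.0)  # lengthen the new upper segment
```

Modifying the figure is then just a matter of editing this structure; the program re-derives everything else from it.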
The primary challenge of the research project was to design the robot’s movements so that they would also work outside the digital realm.
“That’s the hard part of this work, the part where technical innovation is needed,” says Bernhard Thomaszewski of Disney Research Zurich. From a user’s perspective, he says, the tools offered by their program are comparable with those used in the animation of purely digital figures.
However, unlike in digital animations, the robots must obey the laws of physics. In particular, physical robots cannot balance in every pose that is digitally possible, and there is a limit to the accelerations that can be produced by the motors.
“Without support from a computer, it is extremely difficult for users to take these restrictions into account when planning the movements, and this quickly becomes frustrating for the layman,” says Thomaszewski. “This is precisely the task that our software automates through simulation and numerical optimization.
“The user can therefore focus entirely on the creative aspects of the design.”
How the robots move
In order to design the motion of a robot, the user specifies simple motion goals such as “walk forward” or “turn left.”
Vittorio Megaro, a doctoral student at ETH Zurich, designed the program to automatically convert these high-level commands into low-level control signals for the motors, allowing the robot to walk stably.
Whenever the user changes the robot’s skeleton or its motion goals, the computer automatically adapts the time-dependent motor values. This process is very fast, offering immediate feedback on the resulting motion, as predicted by simulation.
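The core constraint the software enforces can be illustrated with a toy example. The sketch below is not the actual algorithm—the real system uses simulation and numerical optimization over whole gaits—but it shows the basic idea of turning a digitally animated joint trajectory into time-dependent motor values a physical motor can follow, here by capping the motor's angular speed (the real tool also respects acceleration and balance constraints).

```python
# Toy illustration: project a desired joint-angle trajectory from the
# animation onto what a speed-limited motor can actually track.

def feasible_trajectory(desired, dt, max_vel):
    """Follow `desired` angles as closely as the speed limit allows.

    desired: list of target angles (radians), one per time step
    dt:      time step in seconds
    max_vel: maximum angular speed of the motor (rad/s)
    """
    angles = [desired[0]]
    max_step = max_vel * dt  # largest angle change possible per step
    for target in desired[1:]:
        step = target - angles[-1]
        step = max(-max_step, min(max_step, step))  # clamp to motor limit
        angles.append(angles[-1] + step)
    return angles

# A jump the animation performs instantly, but a motor cannot:
desired = [0.0] + [1.0] * 9   # snap from 0 to 1 radian
actual = feasible_trajectory(desired, dt=0.1, max_vel=2.0)
# `actual` ramps up in 0.2 rad increments and reaches 1.0 only after 5 steps
```

Because the feasible trajectory can be recomputed almost instantly after every edit, the user sees right away how the physical robot would actually move.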
Once the user is satisfied with the robot, the program automatically generates three-dimensional building plans for all segments of the body and for the connecting parts, which house the electric motors.
Standard sizes of various commercially available motors are stored in the program, so users only need to select the motor they plan to use and the connecting parts are dimensioned to fit it.
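A catalog of standard motor sizes might look like the following sketch. The model names and dimensions are made up for illustration; the point is that once a motor is chosen, the printed connector's dimensions follow mechanically from the stored case size plus clearance and wall thickness.

```python
# Illustrative catalog of servo case dimensions in millimeters
# (hypothetical models and numbers, not real product data).
SERVO_CATALOG = {
    "micro":       {"width": 12.0, "length": 23.0, "height": 29.0},
    "standard":    {"width": 20.0, "length": 40.0, "height": 38.0},
    "high-torque": {"width": 20.0, "length": 40.0, "height": 42.0},
}

def connector_outer_dims(model, wall=2.0, clearance=0.3):
    """Outer dimensions of a printed connector that houses the motor:
    case size plus clearance on each side, plus the wall thickness."""
    case = SERVO_CATALOG[model]
    return {axis: size + 2 * clearance + 2 * wall
            for axis, size in case.items()}

dims = connector_outer_dims("standard")
# e.g. width: 20.0 + 2*0.3 + 2*2.0 = 24.6 mm
```

The generated building plans for the connecting parts can then be sent straight to the 3D printer.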
The parts are fabricated on a 3D printer and, finally, the robot is assembled by hand.
Cheap components, expensive printing
The electric motors, cables, battery and control unit for the robot are available commercially, and Megaro was able to buy these components cheaply online. Manufacturing the robot's limbs on a high-quality 3D printer, however, is considerably more expensive.
Megaro manufactured the first two prototypes using an in-house printer. This was cheap, he says, but the quality of the body parts was not particularly good: in the first prototype, a four-legged robotic dog, the shin bones broke.
He commissioned an outside company to produce his insect-like masterpiece. This one, he says, is made of sturdy, high-grade plastic. "That quality comes at a price," says Megaro.
Megaro and his colleagues intentionally kept the design of their robotic creatures simple. They can only adopt gaits that the user has first created using the software.
Megaro’s five-legged robotic insect can move forwards and sideways using various gaits. It cannot, however, identify obstacles—the robots don’t have sensors and aren’t designed to travel independently. Nor can they be controlled remotely, something that could potentially be achieved using a smartphone app.
“It also wasn’t the project’s aim to create an autonomous robot,” Megaro points out.
The software is still in development and not available to the public yet. Researchers at Carnegie Mellon University collaborated on the project.
Source: ETH Zurich