Interactive robot guided by sensors—not remote

BROWN (US)—A computer scientist at Brown University has created a robot that can chat. It can also gesture, follow movements, and back up naturally, all without the use of remote control devices.

Chad Jenkins, assistant professor of computer science, and his team are using emerging sensor technology to create more natural interactions with robots. Jenkins directs Robotics, Learning, and Autonomy, or RLAB, at Brown.

“We need robots that can adapt over time, that can respond to human commands and interact with humans,” says Jenkins.

This implies that robot behavior must be increasingly determined by user input, a problem a number of RLAB projects are addressing. Among them is a robot soccer experiment in which people use a Nintendo Wii remote, or Wiimote, to participate in the game from the robot’s perspective.

“The player sees what the robot sees,” Jenkins says, “and decides what it should do in a given situation. The person knows what he wants the robot to do, yet the robot’s control policy—the entity that makes decisions for it—may not be capable of reflecting that.”

Input gathered from humans playing the game is used to refine the robot's control policy, so that what the player wants can in turn shape what the robot will do. The user can help the robot build on its primitive locomotion and manipulation skills to perform higher-level tasks.
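
A minimal sketch of how such refinement might work, assuming demonstrations are logged as (state, action) pairs while a human teleoperates and the policy simply imitates the nearest recorded examples. The state representation, action names, and nearest-neighbor approach here are illustrative assumptions, not the RLAB implementation:

```python
# Hypothetical sketch: refining a robot control policy from human demonstrations.
# Names and representations are illustrative, not the RLAB codebase.
from collections import Counter
import math

demonstrations = []  # (state, action) pairs logged while a human teleoperates

def record(state, action):
    """Store one teleoperation sample: what the robot sensed and what the human chose."""
    demonstrations.append((tuple(state), action))

def policy(state, k=3):
    """Pick the action whose recorded states are nearest to the current one (k-NN vote)."""
    if not demonstrations:
        return "idle"
    nearest = sorted(demonstrations, key=lambda sa: math.dist(sa[0], state))[:k]
    votes = Counter(action for _, action in nearest)
    return votes.most_common(1)[0][0]

# Example: states are (ball_distance_m, ball_bearing_rad) as seen by the robot.
record((1.5, 0.0), "kick")
record((1.0, 0.2), "kick")
record((3.0, 0.8), "turn_right")
record((4.0, -0.9), "turn_left")
print(policy((1.4, 0.1)))  # -> "kick" (two of the three nearest demos chose kick)
```

Each teleoperated game adds more demonstrations, so a policy built this way would gradually reflect more of what the human players wanted.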

The same goal of making robots more closely reflect human will and behavior drives another RLAB project, which refines movement in a NASA humanoid upper-body robot.

Researchers use motion capture systems to record human movement in three dimensions, then translate that data into digital models that can be used to create a more effective control policy. The new policy has made it possible for the robot to replicate basic human motion and manipulate objects.
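
One plausible way to turn such recordings into something a controller can follow, offered here only as a sketch, is to resample several captured joint-angle trajectories onto a common time base and average them into a reusable motion primitive. The joint layout and data below are invented for illustration:

```python
# Hypothetical sketch: turning motion-capture joint trajectories into a replayable
# motion primitive for a humanoid arm. Joint count and data are illustrative only.
import numpy as np

def build_primitive(trajectories, n_samples=50):
    """Average recorded joint-angle trajectories into one motion primitive.

    trajectories: list of arrays, each shaped (frames, n_joints), from motion capture.
    Returns an array (n_samples, n_joints) resampled to a common time base.
    """
    resampled = []
    for traj in trajectories:
        t_old = np.linspace(0.0, 1.0, len(traj))
        t_new = np.linspace(0.0, 1.0, n_samples)
        resampled.append(
            np.stack([np.interp(t_new, t_old, traj[:, j]) for j in range(traj.shape[1])], axis=1)
        )
    return np.mean(resampled, axis=0)

# Example: two noisy recordings of a 3-joint reaching motion.
rng = np.random.default_rng(0)
demo = np.linspace([0.0, 0.0, 0.0], [1.2, 0.6, -0.4], 40)
primitive = build_primitive([demo + 0.01 * rng.standard_normal(demo.shape),
                             demo + 0.01 * rng.standard_normal(demo.shape)])
print(primitive.shape)  # (50, 3) joint-angle targets a low-level controller could track
```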

Jenkins also is producing interfaces that could work with a neural cursor control developed by Brown professor of neuroscience John Donoghue. The multidisciplinary research team Donoghue leads has demonstrated that an implant in the primary motor cortex of a paralyzed human subject can convert thought into a control signal that, when decoded, will produce movement in a robotic hand or a jointed robotic arm.

Jenkins is refining the concept so that robotic hands can perform with much more dexterity in this setting, grasping not just with power, for example, but with precision and with pressure appropriate to a given task. The aim is for people with paralysis to use the hand to cook and perform other household activities.

“Here again, we use motion capture to look at how all the joints in the human hand rotate and interact,” Jenkins explains. “Then we map those motions, compressing the data as fully as possible into a two-dimensional cursor control.”
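
A hedged sketch of that kind of compression, assuming the recorded joint angles are projected onto their two largest principal components (PCA). The method, joint count, and data are assumptions for illustration, not necessarily what the lab uses:

```python
# Hypothetical sketch: compressing many hand joint angles into a 2-D control signal
# with PCA. Joint count and data are illustrative, not the lab's actual pipeline.
import numpy as np

def fit_pca_2d(joint_angles):
    """Fit a 2-D PCA projection to (frames, n_joints) motion-capture data."""
    mean = joint_angles.mean(axis=0)
    centered = joint_angles - mean
    # Principal axes = top right-singular vectors of the centered data (via SVD).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:2]  # shape (2, n_joints)
    return mean, components

def to_cursor(mean, components, frame):
    """Map one frame of joint angles to a 2-D cursor position."""
    return components @ (frame - mean)

# Example: 200 frames of 20 joint angles that mostly vary along two hidden directions.
rng = np.random.default_rng(1)
latent = rng.standard_normal((200, 2))
mixing = rng.standard_normal((2, 20))
data = latent @ mixing + 0.05 * rng.standard_normal((200, 20))

mean, comps = fit_pca_2d(data)
print(to_cursor(mean, comps, data[0]))  # two numbers: the cursor's x and y
```

The two projected values would then play the role of the cursor's x and y, the kind of low-dimensional signal a neural interface could drive.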
