GEORGIA TECH (US) — If a robot touched your arm, would you like it or would you feel a little uncomfortable? New research suggests your reaction depends on how you perceive the robot’s intentions.
Researchers at the Georgia Institute of Technology found people generally had a positive response toward being touched by a robotic nurse, as long as the robot was trying to provide medical care, rather than comfort. The research was presented at the Human-Robot Interaction conference in Lausanne, Switzerland.
“What we found was that how people perceived the intent of the robot was really important to how they responded. So, even though the robot touched people in the same way, if people thought the robot was doing that to clean them, versus doing that to comfort them, it made a significant difference in the way they responded and whether they found that contact favorable or not,” says Charlie Kemp, assistant professor of biomedical engineering at Georgia Tech and Emory University.
In the study, researchers looked at how people responded when a robotic nurse, known as Cody, touched and wiped a person’s forearm. Although Cody touched the subjects in exactly the same way, they reacted more positively when they believed Cody intended to clean their arm versus when they believed Cody intended to comfort them.
These results echo findings from earlier studies of human nurses.
“There have been studies of nurses and they’ve looked at how people respond to physical contact with nurses,” says Kemp. “And they found that, in general, if people interpreted the touch of the nurse as being instrumental, as being important to the task, then people were OK with it. But if people interpreted the touch as being to provide comfort . . . people were not so comfortable with that.”
In addition, Kemp and his research team tested whether people responded more favorably when the robot verbally indicated that it was about to touch them versus touching them without saying anything beforehand.
“The results suggest that people preferred when the robot did not actually give them the warning,” says Tiffany Chen, a doctoral student at Georgia Tech. “We think this might be because they were startled when the robot started speaking, but the results are generally inconclusive.”
Since many useful tasks require that a robot touch a person, the team believes that future research should investigate ways to make robot touch more acceptable to people, especially in healthcare. Many important healthcare tasks, such as wound dressing and assisting with hygiene, would require a robotic nurse to touch the patient’s body.
“If we want robots to be successful in healthcare, we’re going to need to think about how do we make those robots communicate their intention and how do people interpret the intentions of the robot,” adds Kemp. “And I think people haven’t been as focused on that until now. Primarily people have been focused on how can we make the robot safe, how can we make it do its task effectively. But that’s not going to be enough if we actually want these robots out there helping people in the real world.”
Another team from Georgia Tech presented research at the conference on how to teach robots to move like humans.
They found that when robots move in a more human-like fashion, with one movement flowing into the next, people can not only better recognize what the robot is doing but also better mimic it themselves.
“It’s important to build robots that meet people’s social expectations because we think that will make it easier for people to understand how to approach them and how to interact with them,” says Andrea Thomaz, an assistant professor at Georgia Tech’s College of Computing.
“Robot motion is typically characterized by jerky movements, with a lot of stops and starts, unlike human movement which is more fluid and dynamic,” says Ph.D. student Michael Gielniak. “We want humans to interact with robots just as they might interact with other humans, so that it’s intuitive.”
Using a series of human movements recorded in a motion-capture lab, they programmed the robot, Simon, to perform the movements. They also optimized that motion, allowing more joints to move at the same time and the movements to flow into each other, to make it more human-like. They then asked their human subjects to watch Simon and identify the movements he made.
“When the motion was more human-like, human beings were able to watch the motion and perceive what the robot was doing more easily,” says Gielniak.
In addition, they tested the algorithm they used to create the optimized motion by asking humans to perform the movements they saw Simon making. The thinking was that if the movement created by the algorithm was indeed more human-like, then the subjects should have an easier time mimicking it. Turns out they did.
“We found that this optimization we do to create more life-like motion allows people to identify the motion more easily and mimic it more exactly,” says Thomaz.
Teaching Simon to move like a human is one thing, but teaching the robot to use subtle cues to attract a human’s attention is the focus of a project led by professor Aaron Bobick. His team found that they can program a robot to understand when it gains a human’s attention and when it falls short.
They wanted to see if Simon could tell when he had successfully attracted the attention of a human who was busily engaged in a task and when he had not.
“Simon would make some form of a gesture, or some form of an action when the user was present, and the computer vision task was to try to determine whether or not you had captured the attention of the human being,” says Bobick.
With close to 80 percent accuracy, Simon was able to tell, using only his cameras as a guide, whether someone was paying attention to him or ignoring him.
“We would like to bring robots into the human world. That means they have to engage with human beings, and human beings have an expectation of being engaged in a way similar to the way other human beings would engage with them,” adds Bobick.
More news from Georgia Tech: www.digitallounge.gatech.edu/