Who’s liable if robots run amok?

STANFORD (US)—As machines manage more everyday tasks, a group of scholars is thinking about the legal challenges that may arise.

“I worry that in the absence of some good, up-front thought about the question of liability, we’ll have some high-profile cases that will turn the public against robots or chill innovation and make it less likely for engineers to go into the field and less likely for capital to flow in the area,” says M. Ryan Calo, a residential fellow at Stanford Law School’s Center for Internet and Society.

Calo and his colleagues are among the first in the country to ponder the potential legal questions facing the emerging field of personal robotics. And the issues go beyond claims of personal injury and property damage.

One consequence of a flood of lawsuits, he says, could be that the United States falls behind other countries—like Japan and South Korea—that are also at the forefront of personal robot technology, a field some analysts expect to exceed $5 billion in annual sales by 2015.

“We’re going to need to think about how to immunize manufacturers from lawsuits in appropriate circumstances,” Calo says.

Defense contractors are usually shielded from liability when the robots and machines they make for the military accidentally injure a soldier, he says.

“If we don’t do that, we’re going to move too slowly in development,” Calo predicts. “When something goes wrong, people are going to go after the deep pockets of the manufacturer.”

To navigate, robots are outfitted with cameras and sensors. And because they run on computer software, they're vulnerable to hacking. A robot designed to clean your house could potentially be turned into a spy, vandal, or thief.

And some predict that at some point, someone will sue for the right to marry their robot.

“Don’t laugh,” Paul Saffo, a technology forecaster and visiting scholar at Stanford’s Media X project, said during a recent panel discussion at Stanford to address the legal challenges surrounding robotics. “People get emotionally attached to their robots.”

Saffo says about two-thirds of people who own Roombas—robotic vacuum cleaners made by the Massachusetts company iRobot—have given names to their machines. Some even take them on vacation and treat them like friends or family members.

Some soldiers in Iraq and Afghanistan have reportedly developed unusual bonds with the robots they’ve used to detect roadside bombs, adds Kenneth Anderson, a research fellow at the Hoover Institution.

“Soldiers come back and talk about their IED-detector robots in a way that [shows] they’ve developed deep relationships,” he notes. “They’ll risk their lives so the robot doesn’t get shot.”

While most robots don’t bear a strong physical resemblance to humans, they are increasingly being built to think like them. Stanford researchers are building personal robots that can make their own way in the world.

Mounted on the bottom half of a Segway scooter, the metal frame of STAIR—Stanford Artificial Intelligence Robot—is stacked with cameras, sensors, and wires. From the center of its “body,” a single arm with pincers at its end can extend to grasp and lift items, push buttons, and move things out of the way.

Depending on how it’s programmed, STAIR can move through a hallway to call an elevator, fetch a stapler from a desk in another room, or unload a dishwasher.

“One of the things robotics researchers often think about is how to design robots that are safe and can help us in our homes so that we can even trust them around our children,” explains Andrew Ng, an associate professor of computer science who helped design and build STAIR.

The researchers predict that before autonomous robots become commonplace in the home and office, they will be used to run MRI scanners, subway systems, and city traffic lights.

“That’s the scary part,” says Saffo. “I predict some company will go bankrupt because of a little bot that goes out of control. We’re not heading into a nirvana, and it’s not going to be hell. But along the way, humans will get killed and cars will go out of control.”

Stanford University news: