Who’s liable if robots run amok?

STANFORD (US)—As machines manage more everyday tasks, a group of scholars is thinking about the legal challenges that may arise.

“I worry that in the absence of some good, up-front thought about the question of liability, we’ll have some high-profile cases that will turn the public against robots or chill innovation and make it less likely for engineers to go into the field and less likely for capital to flow in the area,” says M. Ryan Calo, a residential fellow at Stanford Law School’s Center for Internet and Society.

Calo and his colleagues are among the first in the country to ponder the potential legal questions facing the emerging field of personal robotics. And the issues go beyond claims of personal injury and property damage.

One consequence of a flood of lawsuits, he says, is that the United States will fall behind other countries—like Japan and South Korea—that are also at the forefront of personal robot technology, a field that some analysts expect to exceed $5 billion in annual sales by 2015.

“We’re going to need to think about how to immunize manufacturers from lawsuits in appropriate circumstances,” Calo says.

Defense contractors are usually shielded from liability when the robots and machines they make for the military accidentally injure a soldier, he says.

“If we don’t do that, we’re going to move too slowly in development,” Calo predicts. “When something goes wrong, people are going to go after the deep pockets of the manufacturer.”

In order to navigate, robots are outfitted with cameras and sensors. And because they run on computer software, they’re susceptible to hacking. So a robot designed to clean your house could potentially be turned into a spy, vandal, or thief.

And some predict that at some point, someone will sue for the right to marry their robot.

“Don’t laugh,” Paul Saffo, a technology forecaster and visiting scholar at Stanford’s Media X project, said during a recent panel discussion at Stanford to address the legal challenges surrounding robotics. “People get emotionally attached to their robots.”

Saffo says about two-thirds of people who own Roombas—robotic vacuum cleaners made by the Massachusetts company iRobot—have given names to their machines. Some even take them on vacation and treat them like friends or family members.

Some soldiers in Iraq and Afghanistan have reportedly developed unusual bonds with the robots they’ve used to detect roadside bombs, adds Kenneth Anderson, a research fellow at the Hoover Institution.

“Soldiers come back and talk about their IED-detector robots in a way that [shows] they’ve developed deep relationships,” he notes. “They’ll risk their lives so the robot doesn’t get shot.”

While most robots don’t bear strong physical resemblance to humans, they are increasingly being built to think like them. Stanford researchers are building personal robots that can make their own way in the world.

Mounted on the bottom half of a Segway scooter, the metal frame of STAIR—Stanford Artificial Intelligence Robot—is stacked with cameras, sensors, and wires. From the center of its “body,” a single arm with pincers at its end can extend to grasp and lift items, push buttons, and move things out of the way.

Depending on how it’s programmed, STAIR can move through a hallway to call an elevator, fetch a stapler from a desk in another room, or unload a dishwasher.

“One of the things robotics researchers often think about is how to design robots that are safe and can help us in our homes so that we can even trust them around our children,” explains Andrew Ng, an associate professor of computer science who helped design and build STAIR.

The researchers predict that before autonomous robots become commonplace in the home and office, they will be used to run MRI scanners, subway systems, and city traffic lights.

“That’s the scary part,” says Saffo. “I predict some company will go bankrupt because of a little bot that goes out of control. We’re not heading into a nirvana, and it’s not going to be hell. But along the way, humans will get killed and cars will go out of control.”

Stanford University news: http://news.stanford.edu/news/

You are free to share this article under the Creative Commons Attribution-NoDerivs 3.0 Unported license.

9 Comments

  1. Bill Martin

    Ever heard of Isaac Asimov and the 3 Laws of Robotics?

    “1 A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2 A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
    3 A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.”

  2. Robert

    Uh, Bill? You need to check the alignment of your warp field containment coils: the three laws are fiction, not reality. If they were real, Microsoft would’ve been out of business a long time ago. Besides, there’s a so-called zeroth law that trumps the big three. Try to keep current with the Foundation’s field service bulletins, eh? :)

  3. Tom Wood

    Yeah, but the movie ‘Network’ was fiction too until FOX picked it up and used it as a playbook.

  4. NewEnglandBob

    Require a psych evaluation before allowing ownership of robots. People who are anthropomorphizing them should not be allowed near them until they grow up.

  5. dralf

    “…People who are anthropomorphizing them should not be allowed near them until they grow up.”?
    Who’s kidding whom? People name, talk to, decorate, and otherwise anthropomorphize their cars, guns, and damn near any other type of personal property; why should robots be any different? Heard the newest Dodge Ram commercial yet?

  6. Bill Clinton

    Yes-No coming to class. colleges just sell Admit ticket to class. yes-come to class. no- cannot to class.

    Yes-No. true-false. Albert Eistein, Isaac Newton, Charles Darwin all published their works on paper for world. see.

    Digital text, Digital publishing allow billions billions billions worldwide the results of research, may or may Not be paid with tax-money.

    Knowledge belongs to the world. Knowledge belongs to All people of the earth.

  7. Stephen W. O'Driscoll

    Even though the Three Laws of Robotics are fiction, they are a good starting point for establishing legal parameters. There are certainly multiple loopholes, as Isaac pointed out, but we have to have some basic framework to start with.

  8. Johnny Wellington

    I agree. The Three Laws may be fiction, but so were intelligent computers (e.g., Star Wars, Lost in Space), ray guns (e.g., The War of the Worlds), and communicators (e.g., Star Trek), but look now: computers are smart, we’re working on beam weapons, and we have cell phones (some even chirp like the ST communicators). Sometimes fiction can be the forefront of innovation. Isaac Asimov thought this through when he wrote his books and may have laid the groundwork for the ethics of intelligent machines. H. G. Wells coined the term “aliens” in reference to extraterrestrial entities. It’s a commonplace term now AND a cultural phenomenon for some.

    Open your mind. Ooooo. O.o

  9. Johnny

    I think it would be a great idea to develop robots that can be used in daily life, as long as they can’t evolve into something questionably dangerous. And the three laws of robotics are from the movie I, Robot and would be a great place to start. But at the same time I think I would like to meet a robot that could think for itself. It would be very interesting.