Who Is Responsible If A Robot Kills Someone?


When a person kills someone, the law is pretty clear as to responsibility. But what happens if an autonomous machine is responsible for someone's death? The British Royal Academy of Engineering thinks that it's time someone looked into that.

The RAE recently published a report entitled "Autonomous Systems: Social, Legal and Ethical Issues" that addresses these issues, and also raises the question of when autonomous systems should be regarded as artificial intelligence:

[A]re autonomous systems different from other complex controlled systems? Should they be regarded either as 'robotic people' - in which case they might be blamed for faults; or machines - in which case accidents would be just like accidents due to other kinds of mechanical failure?


The report was prompted, in part, by the lack of existing legal guidelines in these areas, according to lawyer and Imperial College professor Chris Elliott:

It's a very difficult area for the law because the idea that a machine might be responsible for something is not an easy concept at all... If you take an autonomous system and one day it does something wrong and it kills somebody, who is responsible? Is it the guy who designed it? What's actually out in the field isn't what he designed because it has learned throughout its life. Is it the person who trained it? If we can't resolve all these things about who's responsible, who's charged if there's an accident and also who should have stopped it, we deny ourselves the benefit of using this stuff.


The report doesn't reach any specific conclusion beyond "We should really start dealing with this stuff," although it takes some interesting detours along the way ("Most young people only encounter [robots and AI] in computer games where the goal is to destroy them!" being one of the more amusing). Nonetheless, it's interesting to see these issues raised at all, and to watch the British legal system grappling with problems of the 21st Century, for a change.

If an autonomous machine kills someone, who is responsible? [Guardian.co.uk]