Algorithms increasingly work together to help automate our digital lives, but not every result is perfect, positive, or even predicted by their creators. Now, researchers are wondering whether reviving a medieval law could help work out who pays up when things go wrong.
In medieval England, a piece of personal property became a deodand if it was judged responsible for the death of a human being and, as such, was forfeit to the monarch: its owner was ordered to pay the court a fine equal to the object's value. Everything from haystacks to pigs and horses was declared a deodand. The practice was revived in the 1830s to hold railway companies to account for train deaths, but paying a fine equal to the value of an expensive train every time someone died in a crash proved unworkable. Crawford argues that the deodand was killed off by corporate capitalism's ability to shape its own legal accountability, and says we must be wary of allowing technology companies to use unseeable complexity as a reason to wash their hands when things go awry.
Algorithms, Crawford argues, could be treated much like the pigs, haystacks, and whatever else medieval courts branded deodands: when an algorithm screws something up for someone, its owner would stump up the algorithm's value. Valuing an algorithm could, of course, prove tricky. Reckon it could work? [New Scientist]
Image by Simon Evans under Creative Commons license