Maximum Overdrive has entered movie legend as one of the worst films ever made. The 1986 science-fiction horror comedy imagined a world in which inanimate objects, including bulldozers, chainsaws and electric hairdryers, came to life and started massacring people.
But real life came tragically close to imitating fiction during the filming of Maximum Overdrive, when a radio-controlled lawnmower ran into the set and badly wounded the director of photography, who lost an eye. He sued.
In some respects, the history of this film exemplifies much of the popular debate about automation, robots and artificial intelligence. While we seem to panic about the existential threat such technologies may pose to mankind in the distant future, we are in danger of overlooking some of the more immediate concerns about how to manage our mechanical creations.
Who should take moral, ethical and legal responsibility for the actions of increasingly ubiquitous robots? Should it be the manufacturers, programmers or users? In the longer run, when they acquire higher powers of cognition and perhaps consciousness, should it even be the robots themselves?
...
In his forthcoming book Android Dreams,
Such issues are becoming all the more urgent given the explosive growth in the number of drones, driverless cars and medical, educational and domestic robots whizzing around our skies, streets and homes. While this robot revolution promises to improve the human condition, it also threatens to unleash a disruptive economic force.
"If you want to envisage the future in the 1920s, 1940s, 1980s, or in 2017, then you think of robots. But the reality is that robots have been in our societies since the 1950s," he says.
In a paper called Robots in American Law,
The cases mostly revolved around whether robots could be considered surrogates for people: if they should be deemed "animate" for the purposes of import tariffs; whether they could "perform" as entertainers in a concert hall; and whether an unmanned robot submarine could "possess" a wreck for the purposes of salvage claims.
"Emergence is a property that robots will behave in ways that the system cannot anticipate," says
For example, some high-speed trading algorithms are "learning" from patterns in financial markets and responding in ways that their creators cannot predict, and perhaps cannot even understand. Driverless cars are being developed to respond to events in real time (one hopes) rather than being preprogrammed to anticipate every situation on the road.
...
This month, 116 founders of robotics and AI companies signed a petition calling for the outright banning of killer robots - known as lethal autonomous weapons systems, or Laws. The use of such weapons systems would cross a moral red line, they claim. Only humans should be permitted to kill humans.
"We should not lose sight of the fact that, unlike other potential manifestations of AI that still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now," says
However, drawing neat lines between humans and robots in this fast-evolving world is tricky. The latest technologies are blurring the line between people and instruments, making robots agentic, if not necessarily agents. Although today's robots would fail the legal test of mens rea (having intent to commit an offence), they still appear "responsible" for their actions in a layman's sense of the term.
A second big development in robotics, which muddies the picture still further, is the embodiment of AI in physical, sometimes humanoid, form in machines designed to engage directly with people.
"Over the past 10 years we have seen a rise of robots intended to engage directly with people," she says.
...
That has spurred a fast-developing academic field known as human-robot interaction, or HRI. Robotics departments of both universities and companies have been hiring sociologists, anthropologists, lawyers, philosophers and ethicists to inform how these interactions should evolve.
"In a legal and moral sense robots are machines that are programmed by people and designed by people," says
Some of the most striking humanoid robots have been built by
By developing "bio-inspired intelligent algorithms" and allowing them to absorb rich social data via sophisticated sensors, we can create smarter and faster robots,
He adds: "I want robots to learn to love and what it means to be loved and not just love in the small sense. Yes, we want robots capable of friendship and familial love, of this kind of bonding.
"However, we also want robots to love in a bigger sense, in the sense of the Greek word agape, which means higher love, to learn to value information, social relationships, humanity."
If such "moral machines" can truly be created then that raises a whole host of new questions and challenges. Would the robot or its owner possess the rights to its data? Could robots be said to have their own legal identity? Should they, as
"We have a moral responsibility not to destroy the 'Mona Lisa' because it is a remarkable artefact, or an archive or any object that has immense emotional attachment," he says.
But he argues there are great dangers in anthropomorphising systems of intelligence if that leads to misinterpretations and misunderstandings of the underlying technology. Manufacturers should not try to trick users into believing that robots have more capabilities than they possess. "People should not be fooled into thinking that robots are smarter than they actually are," he says.
"I agree that robots could one day have a consciousness. But they would first have to have the ability to play, to build things and take a chocolate biscuit out of a jar on a shelf," he says.
For the moment, few politicians appear interested in such debates. But a grassroots movement of academics and entrepreneurs is pushing these issues higher up the agenda.
In the US, some academics, such as
This year, members of the
Some powerful
"Nobody likes being regulated, but everything (cars, planes, food, drugs, etc) that's a danger to the public is regulated. AI should be, too," he tweeted.
Copyright The Financial Times Limited 2017