Earlier this month, an interdisciplinary team of scholars got together to brainstorm about what robotics and artificial intelligence will do to humanity, and about how much those developments will help or harm us. The panel, titled "Our Robot Future: The Moral, Ethical, and Legal Challenges of Ubiquitous Robotic Systems," asked some unsettling questions. Among them:
What strikes me as remarkable about all of these questions is how long we’ve been hashing some of them out. There are few unique ethical or legal issues raised by drone surveillance that were not already raised by satellite, CCTV, or helicopter surveillance, for example; people used to get much more emotionally attached to their equipment before the industrial revolution, when humans relied on horses, oxen, and so forth to do the jobs machines do today; anti-corruption and anti-organized crime statutes have already been written to deal with criminals who insulate themselves from prosecution by giving ambiguous orders to others; and so on.
In other words, it isn’t that AI and robotics raise principles we haven’t already been hashing out for centuries; they are new applications of old ones. What they change is the stakes of the decisions we make based on those principles. As the world fills up with semi-autonomous machinery operating on conclusions we’ve reached, it will become increasingly important that those conclusions are sound. And we have no way of knowing whether they are until the consequences arrive, by which point the human cost of being wrong may be higher than we can bear.
This is why some scholars see artificial intelligence and robotics as a potential existential threat. And make no mistake: they are one. But in some respects it is a very old threat, namely that humanity’s ability to effect change has always outrun its ability to effect change responsibly. We tend to make mistakes, suffer for them, learn from them, and adapt. The more powerful our technology becomes, the riskier that process becomes.