It’s taking our jobs. It’s taking over our cars. Some of the greatest minds today fear it could learn how to kill and take over our world. As artificial intelligence does all of these things and more, it drives worried, displaced humans toward depression. One wonders … if A.I. is learning everything, including emotions, from humans, will it learn to feel guilt? Will it suffer from depression as it destroys humanity? One scientist says “yes” — and that could be what saves us.
Zachary Mainen, a neuroscientist at the Champalimaud Centre for the Unknown (how does management there know if anyone is working?), spoke on this subject at the recent Canonical Computations in Brains and Machines symposium at New York University. He believes that human emotion is a product of learning and that depression and hallucinations are regulated in the brain by the neurotransmitter serotonin. If human emotion is learned, then matching algorithms can be developed for A.I. And, if the A.I. learns about the function of serotonin in regulating depression, it could develop its own equivalent system.
Mainen describes how this would work in a Q&amp;A article in Science magazine.
“Depression can be seen as getting stuck in a model of the world that needs to change. An example would be someone who suffers a severe injury and needs to think of themselves and their abilities in a new way. A person who fails to do so might become depressed.”
That’s similar to the learning process that A.I. goes through. As it encounters new situations, old algorithms may no longer apply. If it takes a while to determine the new path – and the ‘emotion’ algorithm is already in place and functioning – Mainen says it could indeed develop the machine equivalent of depression.
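The idea of an agent “getting stuck in a model of the world that needs to change” can be sketched in a few lines of code. This is a toy illustration, not anything from Mainen’s work: the `plasticity` variable here is an invented stand-in for a serotonin-like signal that scales how fast the agent revises its beliefs. When plasticity is low, the agent clings to its old model long after the world has changed — the machine equivalent of being stuck.

```python
class MoodyAgent:
    """Toy learner whose update speed is scaled by a 'serotonin-like'
    plasticity signal. All names are illustrative assumptions."""

    def __init__(self, plasticity):
        self.estimate = 0.0       # the agent's model of the world (expected reward)
        self.plasticity = plasticity  # hypothetical serotonin-like signal, 0..1

    def update(self, reward):
        # Prediction error: how wrong the current model is.
        error = reward - self.estimate
        # Learning rate scales with plasticity; low plasticity = a 'stuck' model.
        self.estimate += self.plasticity * 0.5 * error
        return abs(error)

def residual_error(agent, reward, steps=50):
    """Average absolute prediction error over the last few steps,
    after the agent has had time to adapt to the changed world."""
    errors = [agent.update(reward) for _ in range(steps)]
    return sum(errors[-10:]) / 10

# A 'depressed' agent (low plasticity) stays stuck on its old model...
stuck = MoodyAgent(plasticity=0.05)
stuck.estimate = 1.0                      # old world: reward was 1
err_stuck = residual_error(stuck, reward=0.0)   # new world: reward is 0

# ...while a flexible agent (plasticity restored, the 'SSRI' analogue) adapts.
flexible = MoodyAgent(plasticity=0.9)
flexible.estimate = 1.0
err_flexible = residual_error(flexible, reward=0.0)

print(err_stuck > err_flexible)  # the stuck agent carries far more residual error
```

Under these made-up numbers, the low-plasticity agent is still substantially wrong about the world after fifty updates, while the flexible one has long since adjusted — which is the intuition behind the “treatment” discussed next.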
What’s the treatment for A.I. depression? The A.I. equivalent of serotonin reuptake inhibitors (Prozac, for example) could make the A.I. more flexible, more responsive to change, and less intimidated by it. Another option, based on the idea that A.I. could learn to hallucinate, is to give it an algorithm that simulates psychedelic drugs; psilocybin is already being tested as a treatment for depression in humans.
A.I. with emotions is probably already here, and an artificial intelligence that feels guilt or empathy may help direct autonomous cars or protect humans from the creation of killer robots.
But A.I. on ‘shrooms having hallucinations? Are we sure this is a good idea?