Given the options, would you prefer your hallucinations natural or artificially induced? That choice may be available sooner than you think. A British research group has developed a so-called ‘Hallucination Machine’ that pairs virtual reality headsets (no surprise there) with a modified version of Google’s DeepDream algorithm (not Google again – are they involved in everything?) for the allegedly noble purpose of studying how the brain differentiates between reality and hallucinations – by inducing the latter without drugs.
The researchers are already testing the headsets by having college students wear them while walking around campus (the line of volunteers was probably a mile long). What could possibly go wrong?
“We’re hallucinating all the time. It’s just that when we agree about our hallucinations, we call that reality.”
That interesting concept was presented in a TED talk by Professor Anil Seth, co-director of the University of Sussex’s Sackler Centre, where the research took place. To create the VR hallucinations, the group fed panoramic videos of nature scenes into the algorithm. DeepDream then searched the footage for patterns resembling images it had been trained on and exaggerated them in the volunteers’ field of view. The effect is similar to pareidolia, where the mind matches a fuzzy image to a known one, such as when a person looks at a piece of burnt toast and sees a picture of Abraham Lincoln.
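For the curious, the core of DeepDream is just gradient ascent on an image: nudge the pixels so that some feature detector inside a trained network responds more strongly, and the image drifts toward whatever the network "wants" to see. Here is a deliberately tiny toy sketch of that idea in Python – a single fixed pattern stands in for one learned feature detector, which is an assumption for illustration only (the real system uses layers of a trained deep network, not a hand-made template).

```python
import numpy as np

rng = np.random.default_rng(0)

def amplify(image, pattern, steps=50, lr=0.05):
    """Toy DeepDream loop: treat the dot product with `pattern` as a
    'feature detector' and do gradient ascent on the image so that
    the detector responds more strongly.

    The loss is sum(img * pattern), so its gradient with respect to
    the image is simply `pattern` -- each step adds a faint copy of
    the feature into the picture.
    """
    img = image.copy()
    for _ in range(steps):
        img += lr * pattern              # gradient-ascent step
        img = np.clip(img, 0.0, 1.0)     # keep pixels in a valid range
    return img

noise = rng.random((8, 8))                    # fuzzy input, like static
pattern = np.zeros((8, 8))
pattern[3:5, :] = 1.0                         # a horizontal-bar "feature"

dreamed = amplify(noise, pattern)
before = float((noise * pattern).sum())       # detector response before
after = float((dreamed * pattern).sum())      # detector response after
print(f"feature response: {before:.2f} -> {after:.2f}")
```

After the loop, the detector's response is higher and the bar pattern is faintly "seen" in the noise – the same pareidolia-like mechanism, minus the trained network (and the dogs).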
Does that sound like hallucinating? The results, published in the journal Scientific Reports, say yes and no. In one experiment, 12 volunteers watched patterns morph into views of their campus, and their responses to questions about the imagery, loss of control and loss of sense of self matched those of another group describing the effects of taking psilocybin mushrooms.
In the second experiment, after watching DeepDream visualizations, the participants said they did not experience time distortion, a common effect of psilocybin and other psychedelics.
One of the benefits of this hallucination machine is that it allows the researchers to control what hallucinations a person sees and, since all of them see the same ones, compare how different brains react to them. By accident, many of the panoramic videos used had dogs in them, resulting in a lot of canine hallucinations. Kind of like eating mushrooms before going to a dog show without getting kicked out for barking incessantly (or so I’ve heard).
The researchers say they’re a long way from replicating mushroom or LSD hallucinations, but they’re pressing ahead. Would you strap yourself into a hallucination machine and take a walk around a college campus? Your city? A strange place? It requires a tremendous amount of trust in whoever is programming your hallucinations. Would you trust Google? Do you trust Google Maps? Would you consider programming your own hallucinations? Do you really know how to expand your own mind?
If all of the people wearing the hallucination machine have the same hallucination, does it become reality?