Time is weird. It's a concept that's notoriously impossible to define without circular logic. The operational definition of time in physics is simply "what a clock reads." That's it. Yet we hold certain ideas about time as fundamental truths. The arrow of time is one: the idea that time only moves in one direction. Simple enough, right? Here's another one: cause and effect, things cause things to happen. Also pretty simple and just kind of self-evident. Causality is one of the first concepts infants learn, and it's baked into everything we do. But for how basic and integral to our whole reality time is, we don't really understand it or have a better definition than "what a clock reads." Sometimes, discoveries are made that seem to fly in the face of our supposed understanding of time. That's what happened this week, when a paper titled "Causal Asymmetry in a Quantum World" was published in the journal Physical Review X. According to the paper, quantum computers can ignore cause and effect when modeling systems, a discovery that may fundamentally alter our understanding of, and relationship with, time itself.
OK, so two things need to be addressed before this makes sense. Just kidding, we're talking about time and quantum physics. It's not going to make sense, but we're going to try anyway.
The first is cause and effect. Because we understand cause and effect, we can predict things and model possible futures. If you see a person walking around with their shoes untied, you might predict that they're going to fall on their face. You could very well be right: untied shoelaces cause a person to trip, which causes the person to fall on their face. Computers can make these predictions too. However, it doesn't work the other way. It's exponentially harder, and requires a great deal more memory and processing power, for both humans and computers to model systems based on information interpreted in the wrong order. If you're given the statement "a person has fallen on their face," it's a lot harder to figure out why. Another way to say it: it's easier to understand a movie if you watch it from beginning to end than if you watch it in reverse. This is called causal asymmetry, and it seems like a basic idea.
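One way to get a feel for that memory overhead is with a toy sketch (my own illustration, not anything from the paper): a made-up process where each state has exactly one successor. Running it forward takes a single lookup per step, but running it backward forces you to keep track of every state that could have been the cause.

```python
# Toy illustration of causal asymmetry: predicting an effect from a cause
# is one lookup, but retrodicting a cause from an effect means tracking
# whole sets of possibilities. (Hypothetical dynamics, not from the paper.)
from collections import defaultdict

# A many-to-one dynamics on 8 states: each state's unique successor.
step = {0: 1, 1: 2, 2: 3, 3: 0, 4: 1, 5: 2, 6: 3, 7: 0}

def forward(state):
    """Forward in time: one definite effect, one lookup."""
    return step[state]

# Backward in time: precompute every state's set of possible causes.
preimages = defaultdict(set)
for cause, effect in step.items():
    preimages[effect].add(cause)

def backward(state):
    """Reverse in time: all states that could have led here."""
    return sorted(preimages[state])

print(forward(4))   # one effect: 1
print(backward(1))  # multiple possible causes: [0, 4]
```

The asymmetry shows up in the bookkeeping: the forward model is a single dictionary lookup, while the reverse model has to carry around sets of candidate causes that can grow with every reversed step. That extra bookkeeping is a loose classical analogue of the memory overhead the paper's authors are asking about.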
The second is that the physical laws of nature don't actually require time to move in one direction. We just sort of assume time moves in one direction because of our experience and perception. Addressing the question of why it takes so much more processing power and memory to figure out a cause from an effect than the other way around, paper co-author Mile Gu asks:
"When the physics does not impose any direction on time, where does causal asymmetry -- the memory overhead needed to reverse cause and effect -- come from?"
To try to answer that question, the scientists decided to force quantum computers to model systems backwards, essentially making them watch a movie in reverse and figure out how it all fits together. What they found is pretty ridiculous: quantum computers entirely ignore causal asymmetry when reading data in reverse. Cause and effect have no bearing on their ability to figure out systems. In fact, quantum computers can model systems in reverse time more efficiently than classical computers can model systems in forward time. Jayne Thompson, one of the co-authors of the paper, explains the "profound implications" of this discovery:
"The most exciting thing for us is the possible connection with the arrow of time. If causal asymmetry is only found in classical models, it suggests our perception of cause and effect, and thus time, can emerge from enforcing a classical explanation on events in a fundamentally quantum world."
So basically, we keep getting stumped by our need to make the universe conform to our preconceived notions of how things work, and time is still weird.