Watch a movie backwards and you'll likely get confused -- but a quantum computer wouldn't. That's the conclusion of Mile Gu, a researcher at the Centre for Quantum Technologies (CQT) at the National University of Singapore and at Nanyang Technological University, and his collaborators.

In research published 18 July in Physical Review X, the international team show that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it's as if the quantum computer doesn't need to distinguish between cause and effect at all.

The new work is inspired by an influential discovery made almost ten years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences have a built-in arrow of time. An observer who sees the data played from beginning to end, like the frames of a movie, can model what comes next using only a modest amount of memory about what occurred before. An observer who tries to model the system in reverse has a much harder task -- potentially needing to track orders of magnitude more information.
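
To make the idea of "memory needed to model a sequence" concrete, here is a minimal sketch (not taken from the paper) that simulates a made-up two-state hidden Markov process and counts how many distinct predictive rules an observer needs when reading its output forwards versus backwards. The process, the history length and the coarse binning are illustrative assumptions; the actual research uses formal statistical-complexity measures rather than this rough count.

# Toy illustration only: not the processes or measures from the Physical
# Review X paper. Simulate a hypothetical two-state hidden Markov process
# and count how many distinct predictive rules are needed forwards vs.
# backwards, as a rough stand-in for "memory needed to model the data".

import random
from collections import Counter, defaultdict

def simulate(n, seed=1):
    """Generate n symbols from a hypothetical two-state process.

    State A: emit '0' and stay in A, or emit '1' and move to B (50/50).
    State B: emit '0' or '2' (50/50), then return to A.
    """
    rng = random.Random(seed)
    state, out = "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append("0")
            else:
                out.append("1")
                state = "B"
        else:
            out.append("0" if rng.random() < 0.5 else "2")
            state = "A"
    return "".join(out)

def count_rules(seq, k=4, bins=4):
    """Count distinct coarse-grained next-symbol distributions over all
    length-k histories: a rough proxy for how much memory a model needs."""
    counts = defaultdict(Counter)
    for i in range(len(seq) - k):
        counts[seq[i:i + k]][seq[i + k]] += 1
    rules = set()
    for history, c in counts.items():
        total = sum(c.values())
        # Bin the conditional probabilities so that histories with (nearly)
        # identical predictions collapse into a single rule.
        rules.add(tuple(sorted((s, round(bins * v / total)) for s, v in c.items())))
    return len(rules)

seq = simulate(200_000)
print("rules needed forwards :", count_rules(seq))
print("rules needed backwards:", count_rules(seq[::-1]))

For this particular toy process the backward count comes out larger than the forward count, a miniature version of the asymmetry described above; many other processes are perfectly symmetric, and the new result concerns how the comparison changes when the models are quantum.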
