No one knows exactly why deep neural networks succeed at so many tasks; the field has little theory to explain its empirical successes. As Facebook's Yann LeCun has put it, deep learning is like the steam engine, which preceded the underlying theory of thermodynamics by many years.

Quantum computing, for its part, would be the harbinger of an entirely new medium of calculation, harnessing the behavior of subatomic particles to obliterate the time barriers that make some problems incalculable.

But some deep thinkers have been plugging away at the matter of theory for several years now. 

On Wednesday, one such group presented a proof of deep learning's superior ability to simulate the computations involved in quantum computing. According to these researchers, the redundancy of information that occurs in two of the most successful neural-network types, convolutional neural networks (CNNs) and recurrent neural networks (RNNs), is what makes all the difference.
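To make the idea of a network "representing" a quantum computation concrete, here is a minimal, hypothetical sketch of a neural-network wave-function ansatz: a tiny feed-forward network whose weights encode an amplitude for every spin configuration. It is purely illustrative and is not the researchers' construction (their argument concerns what CNNs and RNNs specifically can capture); the architecture, sizes, and names (N, HIDDEN, log_amplitude) are assumptions of this sketch.

```python
# Illustrative sketch only: a small feed-forward network used as a wave-function
# ansatz psi(s) for N spins. Enumerating all 2^N spin configurations yields a full
# state vector, showing how a network's weights can encode a quantum state.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 6            # number of spins (qubits); kept tiny so the full state vector fits in memory
HIDDEN = 16      # hidden-layer width, a stand-in for the network's expressiveness

# Random parameters; in practice these would be trained (e.g. variationally).
W1 = rng.normal(scale=0.5, size=(HIDDEN, N))
b1 = rng.normal(scale=0.1, size=HIDDEN)
w2 = rng.normal(scale=0.5, size=HIDDEN)

def log_amplitude(spins):
    """Map a spin configuration (+1/-1 entries) to a real log-amplitude."""
    h = np.tanh(W1 @ spins + b1)
    return w2 @ h

# Build the full (unnormalized) state vector over all 2^N basis states.
configs = np.array(list(itertools.product([1.0, -1.0], repeat=N)))
amps = np.exp([log_amplitude(s) for s in configs])
state = amps / np.linalg.norm(amps)

print("Number of basis states:", len(state))
print("Norm of the encoded state:", np.linalg.norm(state))
```

The point of the sketch is only that a fixed, modest set of network parameters can stand in for an exponentially large state vector; how efficiently a given architecture can do so for highly structured quantum states is the kind of question the researchers' proof addresses.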
