Quantum versus conventional computing: a closer race than you think

by Hayley Dunning

A new method using conventional computing can reduce simulation time from 600 million years to months, challenging a claim of ‘quantum advantage’.

Achieving ‘quantum advantage’ – where a quantum computer can do something even the world’s fastest conventional supercomputers can’t on a reasonable timescale – is an important landmark on the journey toward creating a useful quantum computer.

Researchers at the University of Bristol, Imperial College London, the University of Oxford, and Hewlett Packard Enterprise are keeping pace with claims of quantum advantage by developing new methods that can reduce simulation time on conventional computers by a speedup factor of around one billion.

Quantum computers promise exponential speedups for certain problems, with potential applications in areas from drug discovery to new materials for batteries. But quantum computing is still in its early stages, so these are long-term goals.

The new research, published today in the journal Science Advances, challenges a previous claim of quantum advantage by improving a method of conventional computing, vastly speeding it up.

Claiming quantum advantage

The study follows an experimental paper from the University of Science and Technology of China (USTC) that was the first to claim quantum advantage using photons – particles of light.

In the USTC experiment, the team generated a large and highly complex quantum state of light and measured it using single-photon detectors in a protocol called ‘Gaussian Boson Sampling’ (GBS). Their paper claimed that the experiment, performed in 200 seconds, would take 600 million years to simulate on the world's largest supercomputer.
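To give a concrete sense of what simulating GBS involves, here is a minimal toy sketch in Python using Xanadu's open-source Strawberry Fields simulator. It is not the code used in the study and is orders of magnitude smaller than the USTC device; the mode count, squeezing strength, and random interferometer are illustrative assumptions, and it presumes a Strawberry Fields version whose Gaussian backend supports multi-shot Fock sampling.

# Toy Gaussian Boson Sampling run: squeezed light sent through a random
# interferometer and measured with photon-number detectors.
# Requires: pip install strawberryfields numpy
import numpy as np
import strawberryfields as sf
from strawberryfields import ops

modes = 4                          # illustrative; the USTC device used far more modes
prog = sf.Program(modes)

# A random unitary for the linear-optical network (illustrative choice)
rng = np.random.default_rng(7)
z = rng.normal(size=(modes, modes)) + 1j * rng.normal(size=(modes, modes))
U, _ = np.linalg.qr(z)

with prog.context as q:
    for mode in q:
        ops.Sgate(0.5) | mode      # single-mode squeezed vacuum at each input
    ops.Interferometer(U) | q      # passive linear-optical network
    ops.MeasureFock() | q          # photon-number-resolving detection

eng = sf.Engine("gaussian")        # classical simulator of Gaussian states
samples = eng.run(prog, shots=5).samples
print(samples)                     # each row is one detected photon-number pattern

Even at this toy scale, the cost of exact sampling grows steeply with the number of modes and detected photons, which is why full-size GBS experiments are so hard to simulate classically.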

The new study reveals that updated methods of simulating GBS can reduce the predicted simulation time of 600 million years down to just a few months, a speedup factor of around one billion. 
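As a rough sanity check on that factor, the arithmetic below takes ‘a few months’ to mean about three months, an assumption made purely for illustration.

# Back-of-the-envelope check of the quoted speedup factor.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
old_estimate = 600e6 * SECONDS_PER_YEAR   # USTC estimate: ~600 million years
new_estimate = 90 * 24 * 3600             # improved methods: ~3 months (assumed)
print(f"speedup ~ {old_estimate / new_estimate:.1e}")   # roughly 2e9, i.e. of order one billion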

Joint first author Dr Bryn Bell, previously of the Department of Physics at Imperial and now Senior Quantum Engineer at Oxford Quantum Circuits, said: “As researchers develop larger scale experiments, they will look to make claims of quantum advantage relative to classical simulations. Our results will provide an essential point of comparison by which to establish the computational power of future GBS experiments.”

The value of current computational resources

Co-author Dr Raj Patel, from the Department of Physics at Imperial and the University of Oxford, said: “Our work with the University of Bristol and Hewlett Packard Enterprise emphasises the need to continue developing simulation methods that run on ‘classical’ hardware.

"Quantum computing is of course the holy grail, but it is often easy to lose sight of the importance of the computational resources we currently have that can help us along the way.

“Using these resources to find the boundary at which quantum advantage can be obtained is not only of academic interest but is crucial in instilling confidence in potential stakeholders in emerging quantum technologies.”

The team’s methods do not exploit any errors in the experiment, so one next step for the research is to combine them with techniques that exploit the imperfections of the real-world experiment. This would further speed up simulation time and build a greater understanding of which areas require improvement.

Computational complexity

Joint first author Jake Bulmer, a PhD student at the University of Bristol, said: “The USTC estimate used the best simulation methods known at the time, but we were confident significant improvements could be made. By asking ourselves what it is about this experiment that makes it complex, we could work out how to simulate it in the most efficient way.

“In essence, our methods reveal which parts of GBS experiments are important and which are not when it comes to designing new generations of devices. For example, we show that if the photon detectors are improved, this could substantially increase the complexity of the experiment.

“These simulated experiments represent a tremendous achievement of physics and engineering. As a researcher, it is truly exciting to contribute to the understanding of where the computational complexity of these experiments arises. We were pretty thrilled with the magnitude of the improvements we achieved: it is not often that you can claim to find a one-billion-fold improvement!”

-

‘The boundary for quantum advantage in Gaussian boson sampling’ by Jacob F. F. Bulmer et al. is published in Science Advances.

Based on a press release by the University of Bristol.

Reporter

Hayley Dunning
Communications Division

Contact details

Tel: +44 (0)20 7594 2412
Email: h.dunning@imperial.ac.uk

Tags:

REF, Global-challenges-Data, Research, Quantum, Big-data
