New system checks accuracy of quantum chips

January 15, 2020
An international team of researchers from MIT, Google, and elsewhere has designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can't.

Quantum chips perform computations using so-called quantum bits (qubits), which can represent not only the two classical binary states, 0 and 1, but also an arbitrary quantum superposition of both states simultaneously. This greatly expands computing capacity and is expected to enable quantum computers to solve problems that are impractical for classical computers.
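As a small illustrative sketch (not taken from the paper), a qubit's state can be written as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# Illustrative sketch: a qubit state is a pair of complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1. The classical bits correspond to the basis states
# |0> = (1, 0) and |1> = (0, 1); a superposition mixes both at once.
ket0 = (1 + 0j, 0 + 0j)
ket1 = (0 + 0j, 1 + 0j)
plus = (1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j)  # equal superposition

def probabilities(state):
    """Born rule: each outcome's probability is its squared amplitude magnitude."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(probabilities(plus))  # roughly (0.5, 0.5): both outcomes equally likely
```

A single classical bit holds one of these two values at a time; the qubit's amplitudes are what a quantum circuit manipulates.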

Although full-scale quantum computers will require millions of qubits and a lot of progress will have to be made to achieve this scale, researchers have already started to develop “Noisy Intermediate Scale Quantum” (NISQ) chips, which contain around 50 to 100 qubits, just enough to demonstrate “quantum advantage” over classical computers.

However, a chip's outputs can look entirely random, and simulating the steps to determine whether everything went according to plan takes a very long time, making verification highly inefficient. This is the problem tackled by the researchers at MIT in a Nature Physics paper co-authored with physicists from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing. They describe a novel protocol that efficiently verifies that an NISQ chip has performed all the right quantum operations, and they validated the protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.

The researchers’ work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.
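A hedged single-qubit sketch of this idea (the gate, states, and helper functions here are illustrative assumptions, not the team's actual protocol): because quantum gates are unitary, applying the conjugate transpose of the programmed operation to the output should recover the input, and a fidelity below 1 signals that the chip did something other than what was programmed.

```python
import math

def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return tuple(gate[i][0] * state[0] + gate[i][1] * state[1] for i in range(2))

def dagger(gate):
    """Conjugate transpose: the inverse of any unitary gate."""
    return tuple(tuple(gate[j][i].conjugate() for j in range(2)) for i in range(2))

# Hypothetical programmed operation: a Hadamard gate on a single qubit.
r = 1 / math.sqrt(2)
H = ((r + 0j, r + 0j), (r + 0j, -r + 0j))

input_state = (1 + 0j, 0 + 0j)              # known input |0>
output_state = apply(H, input_state)        # state the chip produced
recovered = apply(dagger(H), output_state)  # trace the output back

# Fidelity |<recovered|input>|^2 equals 1 only if the chip ran what was programmed.
fidelity = abs(sum(x.conjugate() * y for x, y in zip(recovered, input_state))) ** 2
print(round(fidelity, 6))  # -> 1.0
```

If a faulty chip applied some other gate, `recovered` would no longer match `input_state` and the fidelity would drop, pointing to where the operations diverged.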

At the core of the new protocol, called “Variational Quantum Unsampling,” lies a “divide and conquer” approach, explains first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE) at MIT. This approach breaks the output quantum state into chunks. “Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way,” Carolan says.
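A toy single-qubit sketch of the layer-by-layer idea (the two-layer rotation circuit and coarse grid search below are illustrative assumptions, not the actual variational optimizer): each unscrambling layer is chosen to maximize overlap with the known input, one layer at a time rather than inverting the whole circuit in one shot.

```python
import math

def ry(theta):
    """Single-qubit rotation gate (real-valued amplitudes)."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return ((c, -s), (s, c))

def apply(gate, state):
    return tuple(gate[i][0] * state[0] + gate[i][1] * state[1] for i in range(2))

def overlap(u, v):
    """Squared overlap of two states (amplitudes are real in this toy)."""
    return abs(u[0] * v[0] + u[1] * v[1]) ** 2

# "Scrambling" circuit: two rotation layers applied to the known input |0>.
known_input = (1.0, 0.0)
state = apply(ry(0.7), known_input)
state = apply(ry(0.4), state)          # output observed from the chip

# Greedy unsampling: optimize one inverse layer at a time via a coarse grid
# search, each step maximizing overlap with the known input state.
for _ in range(2):                     # one unscrambling layer per circuit layer
    best = max((overlap(apply(ry(t), state), known_input), t)
               for t in [i * 0.01 for i in range(-314, 315)])
    state = apply(ry(best[1]), state)

print(round(overlap(state, known_input), 3))  # close to 1.0: circuit unscrambled
```

Optimizing one shallow layer at a time keeps each search small, which is the efficiency gain the divide-and-conquer approach is after; the real protocol does this for multi-qubit circuits with a variational optimizer rather than a grid search.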
