Quantum computing could make complex calculations trivial in the future, but right now it's fraught with problems. Consider one of them solved, though, in the shape of a new quantum error correction technique.

One of the many problems exhibited by this breed of future computer is that it exists in the delicate and fuzzy quantum world, using not bits but qubits—quantum bits. Each of these qubits can represent a 0, a 1, or—crucially—a superposition of both at once, providing the ability to dramatically bump up computation speeds.

But as Schrödinger was keen to point out, quantum systems need to be isolated from the rest of the world in order to work: interactions with the external environment cause the system to decohere, collapsing down into a definite binary state, just like a normal, slow computer. The internal workings of a quantum computer introduce decoherence effects too—which in turn bring errors. As a result, scientists have to decide on a tolerable error rate and design for it. The problem is that to achieve an error rate small enough to benefit from a quantum computer, you need an awful lot of qubits, which are expensive to manufacture.

Now, though, a group of researchers led by John Martinis—who now works at Google but was previously based at the University of California, Santa Barbara—has developed a chip that can detect at least one kind of quantum error introduced during calculations. The system, described in a new Nature paper, links together nine qubits so that they can monitor one another for "bit flips," where a qubit assumes a state it isn't meant to because of decoherence—flipping from 1 to 0, say. PhysOrg describes nicely how it works:

> It uses parity information—the measurement of change from the original data (if any)—as opposed to the duplication of the original information that is part of the process of error detection in classical computing. That way, the actual original information that is being preserved in the qubits remains unobserved... in something akin to a Sudoku puzzle, the parity values of data qubits in a qubit array are taken by adjacent measurement qubits, which essentially assess the information in the data qubits by measuring around them.

That means the device can gather enough information about errors affecting the qubits without directly observing them—which would itself lock each qubit into a single state, destroying the super-fast computational ability.
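The Martinis chip does this with nine superconducting qubits and genuinely quantum measurements, but the underlying parity idea has a simple classical analogue: a repetition code, where a value is stored redundantly and only the *agreement* between neighbouring copies is checked. The sketch below is just that classical analogue—the function names and three-bit layout are illustrative, not taken from the paper—showing how parity checks can locate a flipped bit without ever reading out the stored value directly:

```python
import random

def encode(bit):
    """Store one logical bit redundantly in three data bits (repetition code)."""
    return [bit, bit, bit]

def apply_bit_flip(data, index):
    """Simulate decoherence flipping a single data bit."""
    data = list(data)
    data[index] ^= 1
    return data

def measure_syndrome(data):
    """Parity checks between neighbouring data bits -- the analogue of the
    chip's measurement qubits. They reveal WHERE copies disagree, but not
    what value is actually stored."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data, syndrome):
    """Each non-trivial syndrome implicates exactly one bit; flip it back."""
    implicated = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    data = list(data)
    if syndrome in implicated:
        data[implicated[syndrome]] ^= 1
    return data

# Encode a 1, flip one bit at random, then detect and repair the damage.
data = encode(1)
data = apply_bit_flip(data, random.randrange(3))
syndrome = measure_syndrome(data)
repaired = correct(data, syndrome)
print(repaired)  # [1, 1, 1] -- the logical bit survives the flip
```

In the real device the "copies" are entangled qubits and the parity checks are stabilizer measurements performed by interleaved measurement qubits, which is what lets the scheme work without collapsing the encoded quantum state.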

The new technique can accurately detect bit flips and stop them from propagating through a calculation, so they don't contaminate an ongoing computation. It's an important development, effectively removing one of the biggest error headaches currently facing quantum computing. Indeed, Technology Review reports that "experts in the field say it is an important step toward a fully functional quantum computer."

There are still plenty more problems to solve, of course. But for now, we're at least one step closer to the quantum computer we lust for. [Nature via Technology Review]

*Image by Julian Kelly*