Google Unveils Largest Quantum Computer Yet, but So What?

Research scientist Marissa Giustina at Google’s Quantum AI Lab in Santa Barbara, California.
Photo: Google

Google announced its newest 72-qubit quantum computer, called Bristlecone, at a conference and in a blog post yesterday. That’s a big step over the competition—but how big a deal is it?

Quantum computing, or computing based on the principles of physics’ most head-scratching topic, has entered a new era in which it’s doing things that are classically hard. Some researchers are trying to demonstrate that their quantum computers can solve problems that supercomputers can’t. Google thinks that Bristlecone will be the chip that reaches this “quantum supremacy” milestone.

“We are cautiously optimistic that quantum supremacy can be achieved with Bristlecone,” Google research scientist Julian Kelly wrote in a blog post, “and feel that learning to build and operate devices at this level of performance is an exciting challenge.”

Computers perform calculations using bits, which are physical systems that assume one of two choices. We usually call these choices “zero” and “one.” Qubits, or quantum bits, also have zeroes and ones, but they exist and interact with one another based on the rules of quantum mechanics. While they’re calculating, they take on “zero” and “one” simultaneously with different strengths (technically, the state is a linear combination of zero and one, with complex constants serving as “probability amplitudes”). Performing these calculations requires entangling the qubits, essentially making their outputs reliant on one another, which causes certain combinations of outcomes to become more or less likely.
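
To make that a little more concrete, here’s a minimal NumPy sketch, purely my own illustration rather than anything Google runs on its hardware. It shows a superposed qubit giving either outcome with equal probability, and an entangled pair whose individual outcomes are random but always agree.

```python
import numpy as np

# A single qubit is a length-2 vector of complex "probability amplitudes".
# Measuring it yields 0 or 1 with probability equal to |amplitude|^2.
zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)

# An equal superposition: "zero" and "one" at the same time, each with
# amplitude 1/sqrt(2), so each outcome has probability 1/2.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)          # [0.5 0.5]

# Two qubits live in a 4-dimensional space (outcomes 00, 01, 10, 11).
# The Bell state below is entangled: each qubit is individually random,
# but their measurement outcomes are always identical.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)          # [0.5 0.  0.  0.5] -> only 00 or 11
```

The squared magnitudes of the amplitudes are the measurement probabilities, which is why the entangled pair above never produces the mixed outcomes 01 or 10.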

In the long term, quantum computers could have important uses in breaking current cryptography schemes or speeding up search. In the shorter term, they could be useful for things like modeling complex molecules better than classical computers can, finding good solutions to complicated optimization problems, and improving artificial intelligence.

Google has already discussed with Gizmodo how it plans to achieve quantum supremacy: with a specially tailored problem that its quantum computer can complete, but that supercomputers are believed unable to finish in a reasonable amount of time. “We believe the experimental demonstration of a quantum processor outperforming a supercomputer would be a watershed moment for our field, and remains one of our key objectives,” Google’s blog post said.

The number of qubits matters, of course: the leading devices have around 50, and 72 would be the largest yet. But many folks at other organizations have long been pointing out that qubit count isn’t all that matters.

“The name of the game is not just adding more qubits,” Bob Sutor, vice president of Cognitive, Blockchain, and Quantum Solutions at IBM Research, told Gizmodo and others at a press event last week. “It’s the quality of the qubits that count. Having 50 great qubits is far superior to having 2,000 really lousy qubits.” In other words, the qubits must stay quantum, rather than degrading into regular bits, for long enough to finish a calculation.

Google is aware of all these points, of course, and mentions them in the blog post, in which error rates and error correction are a central topic. “Crucially, the processor must also have low error rates on readout and logical operations, such as single and two-qubit gates,” Kelly wrote. But just fabricating the device isn’t everything: Google still needs to test the qubits and present results that convince the scientific community.

If Google does achieve quantum supremacy with Bristlecone, that would be just one of many important milestones in quantum computing, and the supremacy task itself would likely be most useful for benchmarking the computer’s performance. As John Preskill said recently in a paper about the new “NISQ” era of quantum computing we’ve entered, “Quantum supremacy is a worthy goal, notable for entrepreneurs and investors not so much because of its intrinsic importance but rather as a sign of progress toward more valuable applications further down the road.”

And classical computers continue to improve in performance. Recently, IBM researchers successfully simulated a 49-qubit quantum computer on a classical machine. Proving quantum supremacy requires verifying that a classical computer can’t do the problem on similar timescales, and that there isn’t some cleverer way to do the calculation classically that scientists just haven’t thought of yet.
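
That 49-qubit figure is no accident: the memory needed to store a quantum computer’s full state on a classical machine roughly doubles with every added qubit, which is why the supremacy frontier sits around 50 qubits. Here’s a back-of-envelope sketch of my own, assuming a naive state-vector simulation at double precision (clever slicing and compression tricks, like those IBM used, can stretch somewhat past this):

```python
# Rough memory cost of brute-force classical simulation of n qubits:
# the full state vector holds 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (30, 40, 49, 72):
    bytes_needed = (2 ** n) * BYTES_PER_AMPLITUDE
    print(f"{n} qubits: {bytes_needed / 2**30:,.0f} GiB of amplitudes")
```

Even at 49 qubits, the naive approach already needs petabytes of memory, which is part of why the recent 49-qubit simulation was such a feat, and a 72-qubit state vector stored this way would be far beyond any existing machine.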

There are other players in the game, too, using other architectures. IBM’s qubits look a lot like Google’s, but Microsoft, Intel, and startups like IonQ are pursuing vastly different qubit architectures. Some 50-or-so-qubit, special-purpose quantum simulators that rely on trapped atoms have begun to make useful discoveries in physics.

If the new Bristlecone processor can achieve quantum supremacy, that would be a milestone, perhaps in the way that a baby taking its first steps is a milestone. It’s an exciting sign that the baby will soon have a lot more skills, but it’s not mature yet. And in the case of quantum computing, the baby hasn’t even taken first steps. There’s plenty of work yet to be done.

[via Google Research]

(Thanks to Scott Aaronson at UT Austin for helping me fact-check this post.)

DISCUSSION

“Google announced its newest 72-qubit quantum computer... but how big a deal is it?”

It’s big. We are witnessing the next step in computer evolution, and maybe in human evolution as well. All computers today employ the von Neumann architecture, described by Hungarian-American mathematician John von Neumann (1903-1957) in 1945. Johnny (as he was known to his friends) himself predicted that his architecture would be superseded, but he was wrong; it has endured to this very day.

Johnny wrote extensively about the future of computing before his untimely death at age 53. He predicted that computer codes and genetic codes would in effect merge; machine and human development would become interlinked. Gene sequencing was one of the first applications of his new computer; this has now led to the possibility of selecting the properties of children before they are born.

George Dyson, son of physicist Freeman Dyson, points out that computer networks are now evolving through human input in social media and indexing by search engines. He also characterizes the propagation of information by noting that at its peak in the early ‘90s, fiber optic cable was being laid worldwide at a total rate of 5,000 miles per hour, nine times the speed of sound*.

Quantum computing is the first fundamental advance in computing since 1945, and may very well shape the development of the human race in ways we can’t imagine today.

Below: John von Neumann and the IAS Machine, built at the Institute for Advanced Study in Princeton, NJ, from 1945 to 1951; it was the first truly modern computer. One of its initial applications was computing the dynamics of hydrogen bombs. Johnny, a true cold warrior who believed that a nuclear war was inevitable, would be surprised to learn that “...his bomb never exploded, but his computer did”*.

*Turing’s Cathedral, George Dyson, Vintage Press, 2012