
We've Entered a New Era of Quantum Computing

We may earn a commission from links on this page.

Quantum computing might be nascent, but recent advancements have brought us into a new age. And every new era needs a name. So when future computing historians look back on the era starting around 2017, they’ll have a word to describe it: the NISQ era.

The NISQ, or Noisy Intermediate-Scale Quantum, era is a term coined by John Preskill at last month’s Quantum Computing for Business (Q2B) conference held at NASA Ames in California. Business leaders from Fortune 500 companies met with quantum computing experts (and watched me moderate a panel) to learn about how and when they should expect quantum computers to have useful, real-world applications. Preskill has now published a paper based on his Q2B lecture about what this “NISQ” era entails.


In a sentence: “Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today’s classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably,” Preskill writes. Let’s explain what that means.


Quantum computers are computers that solve problems using an entirely different architecture from regular computers. Regular, or classical, computers solve problems by translating data into billions and billions of interacting “bits,” physical systems that represent one of two states, like on-off switches whose settings depend on other on-off switches. Quantum computers instead use qubits, or quantum bits, which during a computation exist as a combination of zero and one, each with some probability of being the result when measured. Qubits talk to each other during a computation using the rules of quantum mechanics, through effects like “quantum interference” and “quantum entanglement.” A quantum computation kind of looks like flipping coins tied together.
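If you want to picture that coin-flip idea concretely, here’s a toy sketch of my own (not how real quantum hardware works, and it ignores entanglement entirely): a single qubit can be modeled as a pair of amplitudes, and measuring it collapses it to 0 or 1 at random.

```python
import random

# Toy sketch: a single qubit is a pair of complex amplitudes
# (alpha, beta) normalized so that |alpha|^2 + |beta|^2 = 1.
def make_qubit(alpha, beta):
    norm = (abs(alpha) ** 2 + abs(beta) ** 2) ** 0.5
    return (alpha / norm, beta / norm)

# Measuring collapses the qubit: 0 with probability |alpha|^2, else 1.
def measure(qubit, rng):
    alpha, _beta = qubit
    return 0 if rng.random() < abs(alpha) ** 2 else 1

rng = random.Random(42)
q = make_qubit(1 + 0j, 1 + 0j)  # an equal superposition of 0 and 1
samples = [measure(q, rng) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

The power of real quantum computers comes from entangling many such qubits so their measurement outcomes are correlated, which this single-qubit toy doesn’t capture.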

Google will imminently declare that they’ve achieved “quantum supremacy,” a term that Preskill himself coined in 2012. This means they will have built a quantum computer that can tackle a particular problem markedly faster than a regular computer can. Doing so would require 50 or so qubits, since present-day classical computers wouldn’t be able to simulate a quantum computer of that size. But there are still a number of challenges in the way of quantum computing becoming truly revolutionary.

So, what’s this NISQ era we’ve entered? Preskill explained that these quantum computers will have between 50 and a few hundred qubits. But these qubits will be noisy: their fragile quantum states can quickly collapse into regular bits, or return an incorrect value. “The noise will place serious limitations on what quantum devices can achieve in the near term,” Preskill writes.

Preskill (admittedly naively) assumes that, until some form of quantum error correction exists to allow for robust qubits and qubit interactions, NISQ-era quantum computers won’t be able to reliably execute circuits with much more than 1,000 two-qubit operations. On top of that, there are still errors in these computers’ ability to read out the final answer, and open questions about how to scale quantum computers up.
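To get a rough feel for why that 1,000-operation figure matters, here’s a back-of-envelope illustration of my own (these are assumed numbers, not a calculation from Preskill’s paper): if each two-qubit gate succeeds independently with some probability, the chance that an entire circuit runs error-free shrinks exponentially with its length.

```python
# Back-of-envelope sketch (my illustration, with an assumed error rate):
# suppose each two-qubit gate independently succeeds with probability p.
p = 0.999  # an assumed ~0.1% error rate per gate

# The probability that a circuit of n gates runs with no errors is p**n.
for n in (100, 1_000, 10_000):
    print(f"{n:>6} gates: {p ** n:.3f}")
# ~0.90 at 100 gates, ~0.37 at 1,000, essentially 0 at 10,000
```

At an assumed 0.1% error rate per gate, a 1,000-gate circuit already fails more often than it succeeds, which is why deeper circuits will have to wait for error correction.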


What use will quantum computers have in this NISQ era, then? Most importantly, they’ll be able to simulate the interactions of many particles better than classical computers can. “Valuable insights might already be gleaned using noisy devices [with around] 100 qubits,” explains Preskill.

We’ll also find out whether they can solve optimization problems better than regular computers can. These kinds of problems are basically find-the-best-solution-to-a-complex-thing, like how to send a fleet of cabs onto the road without having their paths cross. We’ll also find out whether D-Wave’s specialized quantum computer, called a quantum annealer, is any faster than a classical computer at solving these kinds of problems.


It’s less likely, but possible, that we’ll see advantages in deep learning, a form of artificial intelligence. But it’s unlikely that we’ll see quantum computers cracking today’s encryption, or quantum computer networks, in the near term.

For what it’s worth, people seem to think Preskill’s lecture was pretty good. Scott Aaronson, theoretical computer scientist at the University of Texas at Austin, wrote in his blog: “Did you ever wish you had something even better than a clone: namely, someone who writes exactly what you would’ve wanted to write, on a topic people keep asking you to write about, but ten times better than you would’ve written it?”


You should read Preskill’s lecture because it’s geared at people with only an introductory knowledge of quantum computers. But in summary, the NISQ era is simply an age where clunky quantum computers do things that classical computers can’t. These quantum computers still have too many drawbacks to make them revolutionary; as Preskill himself writes, it might be several decades before quantum computers have “transformative effects on society.”

Until then, researchers and companies will continue to find ways to make their qubits more resilient and scale up their computers, mathematicians will keep looking for algorithms where quantum computers show a benefit, and physicists will use these machines to do better simulations. And it’s going to take work to get us into the next era. Preskill concludes:

Quantum technology is rife with exhilarating opportunities, and surely many rousing surprises lie ahead. But the challenges we face are still formidable. All quantumists should appreciate that our field can fulfill its potential only through sustained, inspired effort over decades. If we pay that price, the ultimate rewards will more than vindicate our efforts.


[arXiv via Shtetl Optimized]