Within days of each other back in 1998, two teams published the results of the first real-world quantum computations. But the first quantum computers weren't computers at all. They were biochemistry equipment, relying on the same science as MRI machines.

You might think of quantum computing as a hyped-up race between computer companies to build a powerful processing device that will make more lifelike AI, revolutionize medicine, and crack the encryption that protects our data. And indeed, the prototype quantum computers of the late 1990s indirectly led to the quantum computers built by Google and IBM. But that's not how it all began. It started with physicists tinkering with mathematics and biochemistry equipment for curiosity's sake.

"It was not motivated in any way by making better computers," Neil Gershenfeld, the director of MIT's Center for Bits and Atoms and a member of one of the two teams that first experimentally realized quantum algorithms, told me. "It was understanding whether the universe computes, and how the universe computes."

Computers are just systems that begin with an abstracted input and apply a series of instructions to it to produce an output. Today's computers translate inputs, instructions, and outputs into switches, called bits, that equal either zero or one and whose values control other switches. Scientists have long used computers to simulate the laws of physics, hoping to better understand how the universe works. You can simulate, for example, how far a ball will go based on where it starts and how fast it is thrown.
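As a toy illustration of that kind of bit-based simulation (a hypothetical example, not drawn from any experiment in this story), a few lines of Python can predict the ball's range from its launch speed and angle:

```python
import math

def projectile_range(speed, angle_deg, g=9.81):
    """Classical, bit-based physics: how far a thrown ball lands (meters),
    given launch speed (m/s) and launch angle (degrees), ignoring air resistance."""
    angle = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * angle) / g

print(projectile_range(10, 45))  # a 10 m/s throw at 45 degrees lands ~10.2 m away
```

Every quantity here is ultimately stored as ordinary bits, which is exactly the kind of simulation Feynman found unsatisfying for quantum physics.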


But using bits to simulate physics didn't make much sense to famed physicist Richard Feynman, since the laws of physics at the smallest scale are rooted in a set of rules called quantum mechanics. "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical," Feynman famously said at a 1981 conference.

A small band of scientists spent the following decade theorizing about using these rules to create better simulations. Instead of switches, their quantum simulation's bits would be the dual particle-waves of quantum mechanics. Each individual quantum bit would still be restricted to two choices, but as waves, they can take on both of these states simultaneously with varying strengths, interacting with one another like ocean waves, either amplifying the strength of certain combinations of choices or canceling combinations out. But once you measure these quantum bits, each one immediately snaps into a single state. Those strengths, or amplitudes, translate into the probability of ending up with each outcome.
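That wave behavior can be sketched numerically. In this illustrative snippet (a generic textbook circuit, not tied to any experiment in this story), a qubit starts in a definite state, a Hadamard gate spreads its amplitude evenly across both states, and applying the same gate again makes the amplitudes interfere so the qubit returns to certainty:

```python
import math

# A qubit as a pair of amplitudes for its two states: [amp0, amp1].
# The Hadamard gate spreads or recombines amplitude between the states.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Apply a 2x2 gate to a single-qubit state."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

state = [1.0, 0.0]        # definitely in state 0
state = apply(H, state)   # equal amplitudes: measurement is now a 50/50 coin flip
state = apply(H, state)   # the two paths to state 1 cancel; the paths to state 0 add
probs = [amp ** 2 for amp in state]
print(probs)              # back to certainty: roughly [1.0, 0.0]
```

Squaring the amplitudes gives the measurement probabilities, and the second gate's cancellation is exactly the wave-like interference the theorists wanted to exploit.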

Through the early 1990s, "people thought that quantum computing was essentially mad, and many had [supposedly] proved that it could never work," Jonathan Jones, a physics professor at the University of Oxford who was one of the first to run quantum algorithms on a real quantum computer, told me. Mainly, people thought it was just a curiosity created by theoretical physicists who wondered whether they could understand the universe itself in the language of computers. It also seemed that the finickiness of quantum mechanics (any slight jostle could quickly snap fragile qubits into single-state particles) would make them impossible to realize.


Two milestones busted those ideas. Physicist Peter Shor unveiled an algorithm in 1994 that showed that a computer based on qubits could factor large numbers near-exponentially faster than the best bit-based algorithms. If scientists could invent a quantum computer advanced enough to run the algorithm, then it could crack popular modern-day encryption systems, which rely on the fact that it's easy for classical computers to multiply two large prime numbers together but very, very hard to factor the result back into primes. The second turning point came in the mid-1990s, when physicists started developing error correction: the idea of spreading a single qubit's worth of information across a series of correlated qubits to lessen errors.
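The asymmetry Shor's algorithm threatens can be seen in a toy example: multiplying two primes is instantaneous, while the obvious classical way to undo it, trial division, slows down rapidly as the numbers grow. A minimal sketch, with arbitrarily chosen primes:

```python
def factor(n):
    """Naive trial division: easy to write, but its running time balloons
    as n grows -- the asymmetry RSA-style encryption relies on."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 104729, 1299709   # two primes: multiplying them is one instruction
n = p * q
print(factor(n))         # recovering (104729, 1299709) takes ~100,000 divisions
```

For primes this small the search still finishes quickly; for the hundreds-of-digits primes used in real encryption, every known classical method becomes hopeless, while Shor's algorithm on a large enough quantum computer would not.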

But even after that, the field was small, and the physicists I spoke with described conferences at which most of the world's quantum computing scientists could fit in a single room. Quantum computing forerunners like Charlie Bennett, Isaac Chuang, Seth Lloyd, and David DiVincenzo were coming up with lots of new ideas that percolated quickly through the community. Almost simultaneously, several independent groups realized that the medical and biochemistry industries had long been using a quantum computer in research: nuclear magnetic resonance, or NMR, spectrometers.

NMR, the technology behind MRI, most commonly involves a molecule of interest dissolved in a liquid solvent and placed in a strong magnetic field. The nuclei of the atoms in these molecules have an innate quantum mechanical property called "spin," which is essentially the smallest unit of magnetic information and can be in either of two states, "up" or "down." These spins align with the direction of the field.


In medicine and biochemistry, scientists hit the molecules with additional, smaller oscillating magnetic fields, called radio-frequency pulses, causing the atoms to release characteristic signals that offer physical information about the molecule. Magnetic resonance imaging, or MRI, machines instead use this signal to create a picture. But the physicists realized that they could treat certain molecules in this magnetic field as quantum computers, where the nuclei served as qubits, the spin states were the qubit values, and the radio-frequency pulses were both the instructions and the controllers. These operations are called logic gates, just as they are in classical computers.
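The idea of a pulse acting as a gate can be sketched in a few lines. In this simplified, illustrative model (real NMR control is far messier), an on-resonance radio-frequency pulse rotates the spin's two amplitudes, and a "pi pulse" acts as a NOT gate:

```python
import math

def rf_pulse(state, angle):
    """Idealized on-resonance pulse: rotate a spin's two amplitudes
    [amp_up, amp_down] by `angle` radians about the x-axis. The angle is
    set by the pulse's strength and duration."""
    c, s = math.cos(angle / 2), math.sin(angle / 2)
    up, down = state
    return [c * up - 1j * s * down,
            -1j * s * up + c * down]

spin = [1, 0]                     # spin up, i.e. qubit value 0
spin = rf_pulse(spin, math.pi)    # a "pi pulse" flips it: a NOT gate
print(abs(spin[1]) ** 2)          # probability of spin down is now 1.0
```

A half-length pulse (angle `math.pi / 2`) would instead leave the spin in an equal mix of up and down, which is how superpositions were prepared in practice.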

"In a sense, NMR had actually been ahead of other fields for decades," said Jones, a biochemist who teamed up with physicist Michele Mosca to perform one of the first quantum calculations. "They had done logic gates back in the 70s. They just didn't know what they were doing and didn't call it logic gates."

Physicists including Chuang, Gershenfeld, and David Cory released papers in 1997 detailing how to realize these devices. A year later, two teams, one with Jones and Mosca and another with Chuang and Mark Kubinec, actually performed quantum algorithms. The former used cytosine molecules in which two hydrogen atoms had been replaced with deuterium atoms (hydrogen with an added neutron). The latter used chloroform molecules. They prepared the qubits in initial states, performed a computation by applying specially crafted radio-frequency pulses, and measured the final states.
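That prepare-compute-measure cycle can be simulated directly. The sketch below is a generic two-qubit circuit, not the specific algorithms those teams ran: the state starts out known, two gates do the computation, and squaring the final amplitudes gives the measurement statistics:

```python
import math

s2 = 1 / math.sqrt(2)

# Two-qubit state as four amplitudes, for the outcomes 00, 01, 10, 11.
state = [1.0, 0.0, 0.0, 0.0]              # step 1: prepare a known initial state

def hadamard_on_first(st):
    """Put the first qubit into an equal superposition."""
    a, b, c, d = st
    return [s2 * (a + c), s2 * (b + d), s2 * (a - c), s2 * (b - d)]

def cnot(st):
    """Flip the second qubit exactly when the first qubit is 1."""
    a, b, c, d = st
    return [a, b, d, c]

state = cnot(hadamard_on_first(state))    # step 2: the computation
probs = [amp ** 2 for amp in state]       # step 3: measurement statistics
print(probs)                              # roughly [0.5, 0.0, 0.0, 0.5]
```

The two surviving outcomes, 00 and 11, are perfectly correlated, a distinctly quantum signature that no arrangement of classical switches reproduces.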


We don't often hear about NMR quantum computers today because, even then, physicists knew the technique had its limits, something all of the physicists I spoke with mentioned. More qubits would mean more specially crafted molecules, and the technique relied on workarounds that made the signal harder to pick out of the background noise with each additional qubit. "No one thought it would ever be used for more than a demonstration," Jones said. The machines just weren't scalable beyond a few qubits.

Still, these were important experiments that physicists talk about to this day. NMR machines remain crucial to biochemistry and still have a place in quantum technology. And this early work has had an important, indirect impact on the field: the science behind those radio-frequency pulses lives on in the quantum computers that Google, IBM, and other companies have built, where similar pulses control the qubits. Quantum computers capable of running Shor's algorithm are still decades away, but companies have begun unveiling real devices with dozens of qubits that can perform rudimentary but clearly quantum calculations.


Charlie Bennett, IBM fellow and quantum computing veteran, explained that these experiments weren't enormous discoveries on their own; indeed, the NMR community had been advancing its own science of radio-frequency pulses before quantum computing came along. The physicists I spoke with explained that nobody "won" and there was no "race" back in the late 1990s. Instead, it was a transition point along a road of incremental advances, a moment at which groups of scientists all came to realize that humans had the technology to control quantum states and use them for computations.

"Science is always like that. The whole evidence is more important than almost any one paper," said Bennett. "There are important discoveries, but these rarely occur in single papers."