Quantum computing is a devilishly complex pursuit, and one of its main problems is that it demands exotic materials to create its equally exotic circuitry. Now a team of researchers has managed to create the world’s first quantum logic gate in faithful old silicon.
Engineers from the University of New South Wales have developed a device that allows two quantum bits — known as qubits — to communicate with each other. That ability, achieved in silicon, could make a practical quantum computer a reality.
In whatever device you’re using to read this, data is stored as binary bits, each of which assumes a state of 0 or 1. In a quantum computer, a qubit can assume a state of 0, 1, or both at once. That ability to occupy both states simultaneously, in theory, allows a quantum computer to perform many computations in parallel, making it incredibly fast.
The problem is that a usable quantum computer needs to be able to perform operations not just within a single qubit but between two. Such dual-bit processes can be used to create what are known as logic gates: simple computational units that take two input values and provide a new output based on a simple rule. In the past such systems have been achieved with qubits — but only using circuitry made from exotic materials.
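To make that idea concrete, here is a minimal sketch of the math behind a two-qubit logic gate (a simulation for illustration only, not the researchers' silicon device). A two-qubit state is a vector of four amplitudes over the basis states 00, 01, 10, and 11; a CNOT gate flips the second (target) qubit exactly when the first (control) qubit is 1.

```python
import math

# Illustrative sketch: a two-qubit state as four amplitudes,
# ordered [|00>, |01>, |10>, |11>].
# Start with the control qubit in an equal superposition of 0 and 1
# and the target qubit at 0, so |00> and |10> each get amplitude 1/sqrt(2).
amp = 1 / math.sqrt(2)
state = [amp, 0.0, amp, 0.0]

def cnot(state):
    """Flip the target qubit iff the control qubit is 1,
    i.e. swap the amplitudes of |10> and |11>."""
    return [state[0], state[1], state[3], state[2]]

entangled = cnot(state)
# The result has weight only on |00> and |11>: the two qubits are now
# correlated (entangled), which no single-qubit operation can produce.
print(entangled)  # [0.707..., 0.0, 0.0, 0.707...]
```

The interesting part is the output: measuring either qubit now determines the other, which is exactly the kind of two-qubit interaction the UNSW device performs physically with electron spins.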
Until now, it had proven impossible to achieve the same feat in silicon. “The logic qubit state is encoded on the spin of a single electron,” explains Andrew Dzurak, who led the research. “One key issue is that in order to perform logic between two electron spin qubits, the electrons need to be very close to each other, typically within 20-40 nanometers, and this coupling needs to be highly controllable. This has proved very difficult because of the small scales.”
Now, his team has borrowed concepts from existing transistors to make it possible for two silicon-based qubits to reliably communicate with each other. To do that, they’ve essentially taken transistors that aren’t dissimilar to those used in your computer or smartphone and reconfigured them so that each one has only one electron associated with it. Because the state of a quantum bit can be defined by the spin on a single electron, that essentially turns each transistor into a qubit. In a paper published in Nature, the team shows that they can use metal electrodes on the transistors to control the qubits and have them interact with each other.
“A key breakthrough was finding that we could address each qubit independently, just by controlling the voltage on a metal gate electrode above it,” explains Dzurak. “That really simplifies operation of both one- and two-qubit logic.”
Artist’s impression of a full-scale silicon quantum computer processor, with thousands of individual qubits.
The fact that the team has been able to create this kind of quantum circuitry on silicon is a major advance — mainly because the world of computing is already set up in silicon. Because the approach essentially repurposes existing technology, the team reckons it should allow a full-scale quantum chip to be built far sooner than any other existing technology could manage.
That’s not to say that it’ll be trivial, though. “There is still a lot of engineering to be done, to achieve the wiring layouts required to do the read and write operations on thousands or millions of CMOS qubits,” explains Dzurak. “While we can borrow a huge amount from CMOS chip design, we’ll have to re-engineer aspects of it, in close collaboration with manufacturers and CMOS chip designers.” Regardless, he reckons that “a chip with between tens and hundreds of CMOS qubits could be made in the next 5 years,” if there’s enough investment in place.
Indeed, taking the technique from the science lab to a working, large-scale chip will require some considerable engineering. That’s why his lab plans to work closely with chip manufacturers and designers to build working prototypes with tens of working qubits. “The lessons learned from that stage... would determine exactly how much longer it would be to produce a chip that could solve problems beyond the reach of existing computers,” says Dzurak.
Images by Tony Melov/UNSW