Both human and hardware

Quantum computing’s future is almost semi-here—are we ready for it?

As we approach useful hardware, human elements of computing are becoming critical.

John Timmer
The future of computing is... a big metal tank? If it turns out there's just a guy with a laptop in there doing Google searches, I'm going to be very disappointed. Credit: John Timmer

YORKTOWN HEIGHTS, NY—I’m in a room with one possible future for computing. The computer itself is completely unimposing, looking like a metal tank suspended from the ceiling. What makes an impression is the noise, a regular metallic ping that dominates the room. It’s the sound of a cooling system designed to take hardware to the edge of absolute zero. And the hardware being cooled isn’t a standard chip; it’s IBM’s take on quantum computing.

In 2016, IBM made a lot of noise when it invited the public to try out an early iteration of its quantum computer, which hosted only five qubits—far too few to do any serious calculations but more than enough for people to gain some real-world experience programming the new technology. Amid some rapid progress, IBM installed more tanks in its quantum computing room and added new processors as they were ready. As the company scaled up to 20 qubits, it optimistically announced that 50-qubit hardware was on its way.

During our recent visit to IBM’s Thomas J. Watson Research Center, the company’s researchers were far more circumspect, making clear that they weren’t making promises and that 50-qubit hardware is just a stepping stone toward quantum computing’s future. But they did make the case that IBM is well-positioned to be part of that future, in part because of the ecosystem the company is building up around these early efforts.

Building blocks to chips

For its qubits, IBM uses superconducting wires linked to a resonator, all built on top of a silicon wafer. The wire and wafer let the company leverage its experience building circuitry, but in this case, the wire is a mix of niobium and aluminum, which allows it to superconduct at extremely low temperatures. Jerry Chow, who showed us around the hardware testing room, says the company is still experimenting with the details of how to improve its qubits, testing different formulations and geometries individually or in pairs.

The resonator is sensitive to microwave frequencies, allowing each individual qubit to be set or read out using a microwave pulse. Each chip contains elements that take external microwave input and direct it to individual qubits. There’s nothing special about the microwaves themselves, so the input is created using off-the-shelf hardware. The only challenge is getting the input to the chip, deep in its liquid-helium-cooled tank. The hardware to do so not only has to withstand the extremely cold temperatures but must also survive being warmed back to room temperature. (Once cooled, though, the hardware can operate indefinitely without replacement.)

Quantum computing relies on entanglement among these qubits. Chow told Ars that, to entangle any two of these qubits, you can rely on the fact that they have slightly different resonant frequencies. If you address each member of a qubit pair using the resonant frequency of its partner, then it’s possible to entangle them. Collections of pairs can then be entangled into higher-order entangled systems. The qubits remain coherent for about 100 microseconds at a time, while a qubit pair can be entangled in roughly 10 nanoseconds. Chow said entangling all the qubits on a chip currently takes a few microseconds, leaving most of that coherence window to prepare the whole system and perform calculations.
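The pair-entanglement step can be illustrated in the abstract. The sketch below is a minimal pure-Python statevector simulation (not IBM's hardware-level cross-resonance process): a Hadamard gate on one qubit followed by a controlled-NOT produces a Bell pair whose measurement outcomes are perfectly correlated.

```python
import math

# Two-qubit statevector: amplitudes for |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_q0(s):
    """Apply a Hadamard to qubit 0 (the left qubit in |q0 q1>)."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot_q0_q1(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = cnot_q0_q1(hadamard_q0(state))
# Result is the Bell state (|00> + |11>) / sqrt(2): measuring one qubit
# immediately determines the other, with a 50/50 split between 00 and 11.
probs = [a * a for a in state]
print(probs)  # ~[0.5, 0.0, 0.0, 0.5]
```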

If it’s all this straightforward, why haven’t we already seen the 50-qubit chip?

Part of the cooling system that’s normally inside these tanks. It uses liquid helium to drop temperatures to near absolute zero.
There are external physical controls for the cooling but not the computing.

The problem is that the qubits are extremely sensitive to environmental noise. This can be noise from outside the device (although the metal tank helps shield the chip from a lot of that). But it can also come from inside—the cooling system, microwave cabling, and the chip components themselves can all interact with the qubits. And any sort of interaction is disastrous for calculations.

That means changing anything about the chip’s architecture, even adding a single qubit, has the potential to change the frequency and type of errors when the chip is performing calculations. IBM does extensive modeling to try to limit this problem before a chip is made, but, to a certain extent, it’s an empirical and iterative process: use a chip and see what happens. “Building more qubits will help us identify sources of noise and crosstalk,” said Chow.

That was echoed by Sarah Sheldon, one of the scientists working on the microwave systems that control and read the qubits. “We have good tools for characterizing individual components but don’t have efficient means of characterizing whole devices,” Sheldon said. “As a system gets bigger, we face situations where control of one qubit may cause errors elsewhere.” Later, she added, “We’re approaching the limit where you can’t simulate these devices classically—how do you tell it’s operating properly?”
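Sheldon's point about classical simulation limits is easy to quantify: a full statevector for n qubits holds 2^n complex amplitudes, so the memory needed doubles with every added qubit. A back-of-the-envelope estimate, assuming 16 bytes per complex amplitude:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to store a full n-qubit statevector classically."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 30 qubits fit in a workstation's RAM; 50 qubits don't fit anywhere.
print(statevector_bytes(30) / 2**30)  # 16.0 (GiB)
print(statevector_bytes(50) / 2**50)  # 16.0 (PiB)
```

At around 50 qubits, simply storing the state exceeds any existing machine's memory, which is why verifying correct operation becomes the open question Sheldon describes.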

Supremacy or volume?

The idea behind a quantum computer is that it will perform some specific types of calculations radically faster than normal computers. With a sufficient number of qubits, a quantum computer could solve problems that would take a traditional computer longer than the Universe has existed. The point where a quantum computer has the capacity to do this has been termed “quantum supremacy.” And Google’s announcement of its own quantum computing efforts this week made explicit reference to the concept.

Of course, Google couched its concept of quantum supremacy in terms of a tolerable error rate (one we wouldn’t tolerate from a traditional computer). IBM, by contrast, is developing error-corrected qubits. Unfortunately, each of these requires several additional physical qubits to function. Bob Sutor, the IBM VP in charge of this effort, suggested that a quantum computer with a few hundred error-corrected qubits would end up requiring thousands of physical qubits.

Remember, they’re still working on 50 qubits without any error correction. We’re not going to get to quantum supremacy any time soon.
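Sutor's estimate is, at bottom, a multiplication. The exact overhead depends on the error-correction scheme and physical error rates, neither of which the article specifies, so the 10:1 ratio below is a hypothetical illustration rather than IBM's actual figure:

```python
def physical_qubits_needed(logical_qubits, overhead_per_logical):
    """Total physical qubits for a given error-correction overhead.

    overhead_per_logical is hypothetical here; real schemes derive it
    from the code's distance and the hardware's physical error rates.
    """
    return logical_qubits * overhead_per_logical

# A few hundred error-corrected qubits at even a modest 10:1 overhead
# already lands in the thousands, consistent with Sutor's estimate.
print(physical_qubits_needed(300, 10))  # 3000
```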

Instead, IBM suggests we start thinking in terms of “quantum volume,” a measure that combines the number of qubits used for calculations along with the error rate of calculations run on the machine. Quantum volume would allow a meaningful comparison between IBM’s machines and the one described by Google this week. What it won’t tell us is how useful either machine is.
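One way to combine qubit count and error rate into a single number, roughly in the spirit of early IBM proposals, is to estimate the circuit depth errors allow (shrinking as either the qubit count or the error rate grows) and score the machine by the largest "square" circuit it can run. Treat this as an illustration of the intuition, not IBM's official metric:

```python
def quantum_volume_sketch(n_qubits, error_rate):
    """Illustrative 'quantum volume': width weighed against usable depth.

    Effective depth ~ 1/(n * error_rate): deeper circuits accumulate
    more errors, and more qubits mean more error-prone gates per layer.
    The score is the largest square circuit (width == depth) supported.
    """
    effective_depth = 1 / (n_qubits * error_rate)
    return min(n_qubits, effective_depth) ** 2

# More qubits only help while the error rate keeps the depth usable:
print(quantum_volume_sketch(20, 0.001))  # depth ~50 > 20 qubits -> 400
print(quantum_volume_sketch(50, 0.005))  # depth ~4 < 50 qubits -> 16.0
```

Under this kind of measure, piling on noisy qubits doesn't raise the score; improving the error rate does, which is exactly the comparison quantum volume is meant to enable.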

And, to an extent, the answer there is, “It depends.” For some cases, coping with errors is trivial. Factoring a large number into primes, for example, produces a result that can be checked nearly instantly on a classical computer. In other cases, errors would make the result of a calculation unreliable, with no easy way of identifying problems. The current computers, then, leave us in a somewhat odd place. “We can build something that we can’t predict its behavior classically but aren’t fully fault tolerant,” said Jay Gambetta, manager of IBM’s Theory of Quantum Computing and Information Group. “We don’t know what can be calculated using these.”
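The factoring example shows why some quantum outputs are cheap to trust: even if the quantum machine errs, checking a candidate factorization classically is essentially a single multiplication. A minimal check:

```python
def verify_factorization(n, factors):
    """Classically verify a claimed factorization of n.

    A quantum factoring run can be noisy, but this check is nearly free.
    (A complete check would also confirm each factor is prime.)
    """
    product = 1
    for f in factors:
        product *= f
    return product == n

# Suppose a (noisy) quantum run proposed factors for 15:
print(verify_factorization(15, [3, 5]))  # True: accept the answer
print(verify_factorization(15, [3, 7]))  # False: rerun the machine
```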

He noted that many classical computing algorithms were developed first and only later proven to be efficient. By contrast, it’s hard to do any sort of proof for most quantum algorithms.

It is possible to simply go with the statistics: run the algorithm several times (potentially negating some or all of the quantum speed-up) and take the majority answer. But IBM’s response to this has been, in part, to invite the public to try out its computers. If they’re useful for something in their current state, then there’s a chance that someone will figure that out.
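The "go with the statistics" approach is just a repeated-run majority vote. A sketch, assuming each run of the quantum circuit returns a measured bitstring:

```python
from collections import Counter

def majority_answer(run_results):
    """Take the most common outcome across repeated (noisy) runs.

    Repetition trades away some of the quantum speed-up in exchange
    for confidence in the answer, as the article notes.
    """
    counts = Counter(run_results)
    answer, _ = counts.most_common(1)[0]
    return answer

# Seven noisy runs of the same quantum circuit:
runs = ["101", "101", "100", "101", "111", "101", "001"]
print(majority_answer(runs))  # "101"
```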

SDK for QC

How do you get the public involved with a machine that requires the infrastructure to produce ultra-cold liquid helium and can’t run any existing software? Part of the answer was off on one side of the computing room, in the form of a more traditional Power-based server. The server accepts jobs sent in by people who have signed up to test out the quantum hardware, a group that ranges from huge financial firms to students taking computer science courses.

But IBM is relying on the other part of the answer: a high-level SDK that it is calling QISKit. As Sheldon describes it, the system’s microwave pulses rely on a collection of arbitrary waveform generators, mixers, and amplifiers. But QISKit abstracts all those details away from the system’s users. It allows them to lay out the initial state of the individual qubits and their connections, and the software—a sort of quantum compiler—translates that into the collection of microwave pulses needed to make things happen. “You will never see microwave pulses,” promised Jay Gambetta.

The layout process is all done in Python, allowing people to leverage existing skills.
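The compilation step Gambetta describes, circuit layout in and pulse schedule out, can be caricatured in a few lines. The gate names, frequencies, and schedule format below are invented for illustration; they are not the real QISKit API, which is far more involved.

```python
# A toy "quantum compiler": translate a gate-level circuit layout into
# a schedule of microwave pulses. All frequencies and durations here
# are made up; this only illustrates the abstraction QISKit provides.

QUBIT_FREQS_GHZ = {0: 5.10, 1: 5.23}  # hypothetical resonant frequencies

def compile_to_pulses(circuit):
    """Map each gate to a (target qubit, drive frequency GHz, duration ns) pulse."""
    schedule = []
    for gate, *qubits in circuit:
        if gate == "x":  # single-qubit flip: drive at the qubit's own frequency
            q = qubits[0]
            schedule.append((q, QUBIT_FREQS_GHZ[q], 20))
        elif gate == "cx":  # two-qubit entangling gate: drive the control
            control, target = qubits  # at its *partner's* frequency, echoing
            schedule.append((control, QUBIT_FREQS_GHZ[target], 200))  # Chow's description
    return schedule

circuit = [("x", 0), ("cx", 0, 1)]
print(compile_to_pulses(circuit))  # [(0, 5.1, 20), (0, 5.23, 200)]
```

The point of the real toolchain is the same: users write the top two lines' worth of abstraction, and the pulse-level bottom half stays invisible.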

The cooling system is closed-loop, so the helium doesn’t need to be replaced. Credit: John Timmer

Making things accessible goes a long way toward encouraging users, but its success has left IBM with a community management challenge. Gambetta strongly implied that QISKit and the underlying compiler were pushed out with a focus on having something functional. He discussed how the team needs to make things more modular so that the system can incorporate contributions from its users—according to him, the team has more code contributions than it can actually process. Gambetta also mentioned that he’d like to start seeing the equivalents of code libraries, noting that things like fast Fourier transform implementations have proven useful in solving a huge range of problems.

While IBM may be encouraging these sorts of efforts, Gambetta also hoped they’d emerge naturally. Since its inception, he suggested, quantum computing has largely been in the domain of physics; computer scientists had little reason to get involved since none of the hardware was good enough to let them do any computing. That’s now starting to change, and the involvement of computer scientists could be critical to the field’s development, because, as Gambetta notes, “they think about problems in different ways than the physicists.”

He’s also optimistic about the use of IBM’s hardware by computer science courses. Once quantum computing becomes a normal part of people’s training, then it will be easier for them to view it for what it is: a tool that’s useful for a specific set of problems. At that point, a quantum computer will be similar to a GPU or any other specialized hardware, in that people will just need to decide whether the speed-up it provides will be worth the effort involved in writing the code.

The overall impression from the visit is that quantum computing has reached a transition point. Part of the intent of the visit was clearly to show off the hardware, but the hardware itself may have become the least exciting part. Progress is being made and qubits are being added, yet the process is mostly a slow grind of refinement and empirical testing. And the hardware, hidden inside heavily insulated tanks filled with liquid helium, isn’t much to look at.

Instead, the challenge has shifted to figuring out how to get the most out of the hardware that exists and how to use it to ensure that we’re ready for the increasing hardware capabilities. At this point, the human element—building experience and managing communities—is becoming increasingly important.


John Timmer Senior Science Editor
John is Ars Technica's science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.