
What Is Quantum Computing? The Complete WIRED Guide


Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it’s now. Say “Schrödinger’s superposition” three times fast, and we can dive in.

The History of Quantum Computing Explained

The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn’t just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like a wave instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, existed as probabilities before they were observed. Before you measure an electron’s location, it is neither here nor there, but some probability of everywhere. You can think of it like a quarter flipping in the air. Before it lands, the quarter is neither heads nor tails, but some probability of both. 

If you find that baffling, you’re in good company. A year before winning a Nobel Prize for his contributions to quantum theory, Caltech’s Richard Feynman remarked that “nobody understands quantum mechanics.” The way we experience the world just isn’t compatible with it. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s, a few of them, including Feynman, began to wonder whether quantum phenomena like subatomic particles’ probabilistic existence could be used to process information. The basic theory or blueprint for quantum computers that took shape in the ’80s and ’90s still guides Google and other companies working on the technology.

Before we belly flop into the murky shallows of quantum computing 0.101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world’s fastest supercomputer all basically do the same thing: They perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.
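To make that encoding concrete, here is a minimal sketch in Python (our illustration, not how any particular chip works) of a character stored as bits and a basic logic operation of the kind circuits perform on them:

```python
# The character "W", as the 8-bit pattern a classical computer stores.
bits = format(ord("W"), "08b")
print(bits)  # 01010111

# Computation is ultimately logic operations on those bits, such as AND:
print(0b0101 & 0b0011)  # 1 (binary 0001)
```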

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples, at least among a very select slice of humanity, include superconducting circuits and individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

The looped cables connect the chip at the bottom of the structure to its control system. Photograph: Amy Lombard

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That’s not quite true and also not quite false. The qubit in superposition has some probability of being 1 or 0, but it represents neither state, just like our quarter flipping into the air is neither heads nor tails, but some probability of both. In the simplified and, dare we say, perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out. The operation of reading a qubit’s value crashes it out of a mix of probabilities into a single clear-cut state, analogous to the quarter landing on the table with one side definitively up. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
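For readers who want to see the quarter flip in code, here is a toy model in Python, a sketch rather than real hardware or any vendor’s API; the equal 50/50 amplitudes are an assumption chosen for simplicity:

```python
import numpy as np

# Toy model of one qubit in superposition: two complex amplitudes,
# one for the state 0 and one for the state 1 (assumed equal here).
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# The math of a superposition fixes only the *probabilities* of each
# readout: the squared magnitude of each amplitude.
probs = np.abs(state) ** 2  # [0.5, 0.5]

# Reading the qubit crashes it into a single definite value, like the
# quarter landing: each readout is simply 0 or 1, and only repeated
# runs reveal the underlying 50/50 statistics.
rng = np.random.default_rng()
print(probs, rng.choice([0, 1], size=10, p=probs))
```

The cancellation trick works because amplitudes, unlike ordinary probabilities, can be negative or complex, so the contributions of wrong-answer paths can subtract away while the right answer reinforces itself.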

For some problems that are very time-consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover’s algorithm, a famous quantum search algorithm, could find you in a phone book of 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover’s and some other quantum algorithms, the bigger the initial problem—or phone book—the further behind a conventional computer is left in the digital dust.
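That phone-book arithmetic can be checked with a toy simulation. The sketch below is a classical numpy state-vector simulation with assumed parameters (1,024 entries instead of 100 million), running the two steps of Grover’s algorithm in a loop: an oracle that marks the target entry, then a “diffusion” reflection that amplifies it:

```python
import numpy as np

# Toy state-vector simulation of Grover's search. N and target are
# illustrative assumptions; a real device would use ~log2(N) qubits.
N = 1024
target = 123

# Start in an equal superposition over all N phone-book entries.
state = np.full(N, 1 / np.sqrt(N))

# Roughly (pi/4) * sqrt(N) iterations suffice: ~25 here, versus the
# N/2 = 512 lookups a classical scan would need on average.
iterations = round(np.pi / 4 * np.sqrt(N))
for _ in range(iterations):
    state[target] *= -1               # oracle: flip the target's sign
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

# The target entry now holds nearly all the probability.
print(iterations, int(np.argmax(state**2)), float(state[target] ** 2))
```

After about 25 iterations the target holds almost all the probability, and the gap only widens as the list grows: the classical scan scales linearly while the Grover loop count grows with the square root.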

The reason we don’t have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s or wipe out a crucial superposition. Qubits have to be carefully shielded and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. A major area of research involves developing algorithms for a quantum computer to correct its own errors, caused by glitching qubits. So far, these algorithms have been difficult to implement because they consume so many of a processor’s qubits that few or none are left to work on the actual problem. Some researchers, most notably at Microsoft, hope to sidestep this challenge by developing a type of qubit from clusters of electrons, known as a topological qubit. Physicists predict that topological qubits will be more robust to environmental noise and thus less error-prone, but so far they’ve struggled to make even one. Microsoft researchers announced a hardware breakthrough in 2018, only to retract the work in 2021 after other scientists uncovered experimental errors.
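The core trick behind error correction, quantum or classical, is redundancy: spread one logical bit across several unreliable physical ones. Here is a hedged classical analogy in Python, the 3-bit repetition code with an assumed 5 percent flip rate. Real quantum codes are far subtler, since they must catch errors without directly reading the fragile qubits, but the overhead on display here is the same reason error-corrected machines need so many extra qubits:

```python
import numpy as np

# Classical analogy for error correction (an illustration, not a quantum
# code): protect one logical bit by storing three noisy copies of it.
rng = np.random.default_rng(0)
flip_prob = 0.05      # assumed per-bit error rate
trials = 100_000

# Unprotected: a single bit, flipped with probability flip_prob.
raw = rng.random(trials) < flip_prob

# Protected: three copies per trial, corrected by majority vote afterward.
copies = rng.random((trials, 3)) < flip_prob   # True where a copy flipped
decoded_wrong = copies.sum(axis=1) >= 2        # vote fails if 2+ copies flip

print(raw.mean())            # ~0.05 error rate, uncorrected
print(decoded_wrong.mean())  # ~0.007, roughly an order of magnitude better
```

The catch is the cost: every protected bit here takes three physical bits, and quantum codes can demand hundreds or thousands of physical qubits per logical qubit.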

Still, companies have demonstrated promising capabilities with their limited machines. In 2019, Google used a 53-qubit quantum computer to generate numbers that follow a specific mathematical pattern faster than a supercomputer could. The demonstration kicked off a series of so-called “quantum advantage” experiments: an academic group in China announced its own demonstration in 2020, and the Canadian startup Xanadu announced one in 2022. (These experiments were long known as “quantum supremacy” experiments, but many researchers have opted to change the name to avoid echoing “white supremacy.”) Researchers have challenged each quantum advantage claim by developing better classical algorithms that let conventional computers work on the same problems more quickly, a race that propels both quantum and classical computing forward.

Meanwhile, researchers have successfully simulated small molecules using a few qubits. These simulations don’t yet do anything beyond the reach of classical computers, but they might if they were scaled up, potentially aiding the discovery of new chemicals and materials. None of these demonstrations offers direct commercial value yet, but they have bolstered confidence and investment in quantum computing. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.
