Quantum Bits is intended to introduce quantum computing terms in a fun and accessible way. It is a personal project of Yuval Boger, whose day job is Chief Commercial Officer at QuEra Computing. You might also know Yuval from his Superposition Guy's Podcast series.

Subscribe on Substack at https://qubitguy.substack.com/ to get both comics and podcasts.

Latest Posts

Hadamard Gate

A quantum comic strip about the Hadamard gate

Quantum computers manipulate qubits through operations called gates, and the Hadamard is one of the most important. It is often the first operation in a quantum algorithm, and understanding it unlocks much of what follows in this book.

Here is what it does: it takes a qubit in a definite state, say 0, and transforms it into an equal superposition of 0 and 1. Mathematically, the qubit’s amplitude is split evenly between both possibilities. If you immediately measure the qubit after applying a Hadamard gate, you get 0 or 1 with exactly 50-50 odds.
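For the mathematically curious, here is that action in a few lines of Python with NumPy. This is just the standard 2×2 Hadamard matrix acting on a qubit's amplitude vector, independent of any particular hardware:

```python
import numpy as np

# The Hadamard matrix: H = (1/sqrt(2)) * [[1, 1], [1, -1]]
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # a qubit in the definite state 0
psi = H @ ket0                # apply the Hadamard gate

print(psi)                    # both amplitudes equal 1/sqrt(2) ~ 0.7071
print(np.abs(psi) ** 2)       # measurement probabilities: [0.5 0.5]
```

Squaring the amplitudes gives the 50-50 measurement odds described above.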

Why is this useful? Because superposition is the raw material that quantum algorithms need. Before a quantum computer can exploit interference to amplify correct answers and suppress wrong ones, it must first create superpositions to work with. The Hadamard gate is the standard tool for that job.

One elegant property: applying a Hadamard gate twice returns the qubit to its original state. The first application creates superposition; the second undoes it. This reversibility is characteristic of quantum gates in general and reflects a deep principle: quantum operations (other than measurement) are always reversible.
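You can verify this property directly: the Hadamard matrix multiplied by itself is the identity, so two applications in a row leave any qubit unchanged. A quick check:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying H twice is the identity operation
print(np.allclose(H @ H, np.eye(2)))   # True

ket0 = np.array([1.0, 0.0])
print(H @ (H @ ket0))                  # back to [1. 0.]: the original state
```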

How the gate is physically implemented depends on the hardware. In superconducting systems, it is a precisely calibrated microwave pulse applied to a tiny circuit. In trapped-ion computers, it is a laser pulse tuned to a specific atomic transition. In neutral-atom systems like those built by QuEra, it is similarly a laser pulse, but applied to atoms held in optical tweezers. The physics differs, but the mathematical effect is identical across all platforms.

When you apply Hadamard gates to many qubits at once, you create a superposition over all possible combinations of 0s and 1s. This is the starting configuration for algorithms like Grover’s search and the Deutsch-Jozsa algorithm, where the computation begins by exploring all possibilities equally before interference narrows the outcomes.
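A short NumPy sketch of this: applying a Hadamard to each of n qubits is an n-fold tensor (Kronecker) product, and starting from the all-zeros state it spreads equal amplitude over all 2^n combinations:

```python
import numpy as np
from functools import reduce

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

n = 3
Hn = reduce(np.kron, [H] * n)     # H on every one of the n qubits

state = np.zeros(2 ** n)
state[0] = 1.0                    # start in 000
psi = Hn @ state

# Every one of the 2^n = 8 basis states gets amplitude 1/sqrt(8)
print(psi)
```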


Quantum Teleportation

A quantum comic strip about quantum teleportation

Quantum teleportation transfers the exact quantum state of one particle to another particle at a distant location, without physically moving anything between them. Despite the name, it has nothing to do with science fiction transportation. It is a precisely defined quantum information protocol, first proposed by Charles Bennett and colleagues in 1993 and experimentally demonstrated in 1997.

The protocol requires three ingredients: the source qubit whose state you want to transfer, a pair of entangled qubits (one held by the sender, one by the receiver), and a classical communication channel.

The sender performs a joint measurement on the source qubit and their half of the entangled pair. This measurement yields two classical bits of information and destroys the source qubit’s quantum state. The measurement also instantaneously changes the state of the receiver’s entangled qubit, but in a way that depends on the measurement outcome. The sender transmits the two classical bits to the receiver through an ordinary channel. The receiver applies a specific quantum gate chosen based on those two bits. After this correction, the receiver’s qubit is in exactly the state the source qubit originally held.
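The whole protocol fits in a short statevector simulation. The sketch below (plain NumPy; qubit 0 is the source, qubit 1 the sender's half of the entangled pair, qubit 2 the receiver's) walks through exactly the steps above: entangle, Bell-measure, use the two classical bits to pick a correction.

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(7)
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])

def on(gate, k):                    # gate acting on qubit k of 3 (q0 leftmost)
    ops = [I2] * 3; ops[k] = gate
    return reduce(np.kron, ops)

def cnot(c, t):                     # CNOT with control c, target t
    a = [I2] * 3; a[c] = P0
    b = [I2] * 3; b[c] = P1; b[t] = X
    return reduce(np.kron, a) + reduce(np.kron, b)

def measure(state, k):              # projective measurement of qubit k
    bit = (np.arange(8) >> (2 - k)) & 1
    p1 = np.sum(np.abs(state[bit == 1]) ** 2)
    m = int(rng.random() < p1)      # sample the outcome...
    state = np.where(bit == m, state, 0)
    return m, state / np.linalg.norm(state)   # ...and collapse

alpha, beta = 0.6, 0.8j             # an arbitrary source state
state = np.zeros(8, dtype=complex)
state[0], state[4] = alpha, beta    # q0 = source, q1 = q2 = 0

state = cnot(1, 2) @ (on(H, 1) @ state)   # entangle sender (q1) and receiver (q2)
state = on(H, 0) @ (cnot(0, 1) @ state)   # rotate q0,q1 into the Bell basis
m0, state = measure(state, 0)             # the two classical bits the
m1, state = measure(state, 1)             # sender transmits
if m1: state = on(X, 2) @ state           # receiver's correction depends
if m0: state = on(Z, 2) @ state           # on those bits

received = state[4 * m0 + 2 * m1 : 4 * m0 + 2 * m1 + 2]
print(received)   # amplitudes (0.6, 0.8j): the source state, now on the receiver's qubit
```

Note that the source qubit's state is gone after the measurement, exactly as the no-cloning theorem demands; only the receiver's qubit holds it.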

The original state is destroyed in the process, consistent with the no-cloning theorem: information is moved, not copied. The protocol cannot transmit information faster than light, because the receiver must wait for the classical message. And the entangled pair is consumed: each teleportation requires a fresh one.

Quantum teleportation is a foundational building block for quantum networking, enabling quantum information to be transmitted between distant nodes without exposing it to decoherence in a physical channel. It also appears in certain quantum computing architectures and in error correction protocols. It is not a curiosity. It is infrastructure.


Quantum Measurement

A quantum comic strip about measurement

Carpenters say “measure twice, cut once.” Quantum physicists would say “measure once, and whatever you had is gone.” Measurement in quantum mechanics isn’t a gentle peek at what’s inside. It’s the end of the road for the superposition you worked so hard to create. Quantessa explains:

Measurement is where quantum mechanics meets classical reality. It is the act of extracting information from a qubit, and it is irreversible.

A qubit in superposition carries amplitudes for both 0 and 1. These amplitudes determine the probability of each outcome when measured: if the amplitude for 1 is larger, measuring 1 is more likely. But measurement does not simply reveal a pre-existing value. It forces the qubit to commit. The superposition collapses, the qubit becomes a definite 0 or 1, and the quantum state that existed before measurement is gone permanently. If you measure the same qubit again, you will get the same result every time. The superposition is not hiding somewhere waiting to return. It has been destroyed.
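A toy simulation makes the collapse concrete. In the sketch below (plain Python with NumPy), the first measurement outcome is sampled from the squared amplitudes; after that, the state is a definite 0 or 1 and every repeated measurement agrees:

```python
import numpy as np

rng = np.random.default_rng(0)

# A biased superposition: amplitude 1/2 for outcome 0, sqrt(3)/2 for outcome 1
psi = np.array([0.5, np.sqrt(3) / 2])
probs = np.abs(psi) ** 2
print(probs)               # [0.25 0.75]: outcome 1 is three times as likely

# First measurement: the qubit commits to one outcome...
outcome = rng.choice([0, 1], p=probs)

# ...and the superposition collapses to a definite state
psi = np.zeros(2)
psi[outcome] = 1.0

# Every later measurement of this qubit repeats the same result
repeats = [rng.choice([0, 1], p=np.abs(psi) ** 2) for _ in range(5)]
print(outcome, repeats)
```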

This is not a limitation of our instruments. It is a fundamental feature of quantum mechanics. The information encoded in the amplitudes and phase of a superposition cannot be fully extracted by measurement. You get one classical bit out. The rest is lost.

This creates the central design constraint of quantum computing. During computation, qubits must remain unmeasured to preserve their superpositions and entanglement. At the end, measurement must occur to extract the answer. The entire art of quantum algorithm design is arranging the computation so that when measurement finally happens, the correct answer appears with high probability — not certainty, but close enough that running the algorithm a few times all but guarantees it.

Measurement also plays a constructive role. In quantum error correction, ancilla (helper) qubits are measured mid-computation to detect errors on data qubits. These measurements are carefully designed to reveal information about errors without revealing (and thus destroying) the protected quantum information. Mid-circuit measurement combined with feed-forward, where subsequent operations depend on measurement outcomes, enables real-time error correction and is a prerequisite for fault-tolerant quantum computing.


Entanglement

Entanglement is not just what happens when you pull your charging cables from your bag and find they are deeply connected. Some physicists would call that “string theory.” Quantessa explains what entanglement means in the quantum world.

A quantum comic strip about entanglement

In 1935, Einstein, Podolsky, and Rosen published a paper arguing that quantum mechanics must be incomplete. Their objection centered on a phenomenon that Einstein called “spooky action at a distance.” Today we call it entanglement, and it has become one of the most powerful tools in quantum computing.

When two qubits become entangled, their quantum states are no longer independent. Measuring one qubit instantly tells you something definite about the other, regardless of the distance between them. For example, in one type of entangled state called a Bell state, the qubits are perfectly anti-correlated: if you measure the first as 0, the second will definitely be 1, and vice versa. Other Bell states and entangled states produce different correlations: the qubits might always match instead, or their relationship might be more complex.
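Here is a small NumPy sketch of that anti-correlated Bell state. Sampling joint measurement outcomes from the squared amplitudes, the two bits always come out opposite:

```python
import numpy as np

rng = np.random.default_rng(1)

# The anti-correlated Bell state (01 + 10) / sqrt(2),
# with amplitudes listed in basis order 00, 01, 10, 11
bell = np.array([0, 1, 1, 0]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # [0, 0.5, 0.5, 0]

samples = []
for _ in range(5):
    joint = rng.choice(4, p=probs)      # measure both qubits at once
    a, b = joint >> 1, joint & 1        # split into the two bits
    samples.append((a, b))
    print(a, b)                         # always opposite: (0, 1) or (1, 0)
```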

This is not the same as flipping two coins that were secretly programmed to match. In the 1960s, physicist John Bell established a mathematical limit on how strongly correlated two particles can be if their outcomes are predetermined. Entangled particles exceed that limit. Experiments have confirmed this repeatedly. The correlations are genuinely quantum, with no classical explanation.
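For readers who want a number attached to Bell's limit: the sketch below uses the textbook result that for a singlet entangled pair, the correlation between measurements at detector angles a and b is E(a, b) = −cos(a − b), and evaluates the standard CHSH combination. Predetermined (classical) outcomes cap this quantity at 2; the quantum prediction is 2√2:

```python
import numpy as np

# Quantum prediction for the singlet state's measurement correlation
def E(a, b):
    return -np.cos(a - b)

# Standard CHSH angle choices (radians)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)   # ~2.828 = 2*sqrt(2), above the classical bound of 2
```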

For quantum computing, entanglement transforms a collection of qubits into something more powerful than the sum of its parts. It is what makes the state space of n qubits grow as 2^n rather than simply n. Without it, qubits would be independent coin flips offering no advantage over classical bits. With it, quantum algorithms can set up interference patterns across that exponentially large space, amplifying correct answers and suppressing wrong ones. This is what enables many quantum algorithms to outperform their classical counterparts.

Maintaining entanglement is difficult. Any interaction with the environment can cause decoherence, breaking the delicate quantum correlations between qubits. The practical challenge of preserving entangled states long enough to complete a computation is one of the defining problems in quantum engineering today.


World Quantum Day

Comic about world quantum day

You probably know about Pi Day on March 14th. We use the first three digits of pi to celebrate math. World Quantum Day works the same way. Scientists observe it on April 14th to honor a fundamental rule of nature called Planck’s constant. Written in certain scientific units (electron-volt seconds), this tiny number starts with 4.14. It captures the fact that energy does not flow in a smooth, continuous stream. Instead, energy arrives in small, separate packets called quanta. Think of it like walking up a wooden staircase instead of sliding up a smooth ramp. You can only place your foot on a specific stair, never float in the empty space between them. Everything in quantum computing relies on this staircase rule.

We celebrate this date to help everyone understand these invisible rules of nature. Right now, quantum computers are still fragile and mostly sit inside quiet research labs. They cannot run complex programs or solve major global problems today. However, researchers are building better machines piece by piece. World Quantum Day reminds us that we need fresh ideas from people outside the traditional physics world. It is an open invitation to learn how the universe actually operates at its smallest level. We need curious students to step up and help turn these basic rules into the useful computing tools of tomorrow.

Quantum Error Correction

A simplistic view of the progression of quantum computers has three stages, each with a question:

Stage 1: Can quantum computers be built at all, regardless of the quantity and quality of qubits?

Stage 2: Can errors be detected and corrected?

Stage 3: Can quantum computers scale to a sufficiently large number of high-quality qubits?

Error correction is the key question in stage 2. Quantessa explains:

Quantum computers are extremely sensitive. The qubits that store information can be thrown off by the tiniest disturbance: a stray vibration, a small temperature fluctuation, even leftover electromagnetic noise. When a qubit picks up an error, the computation goes wrong. And unlike a classical computer, you can’t just check a qubit’s value to see if it’s correct, because measuring it destroys the superposition you’re trying to protect.

Quantum error correction is a clever workaround. Instead of storing one piece of information in one qubit, you spread it across a group of qubits. These extra qubits don’t hold their own data; they work together so the system can detect when something has gone wrong without directly looking at the protected information. Think of it like a group project where five people each know part of the plan. If one person gets confused, the others can compare notes and figure out what changed, then fix it, all without revealing the full plan to an outsider. The “logical qubit,” the reliable unit of information, emerges from the teamwork of many “physical qubits.”

This is one of the biggest engineering challenges in quantum computing today. Building machines with enough high-quality physical qubits to make error correction work at scale is what separates experimental quantum devices from the fault-tolerant quantum computers that will eventually tackle real-world problems.
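The flavor of the idea can be shown with the classical skeleton of the three-qubit bit-flip code. This is a deliberately simplified sketch: a real quantum version measures these same parities with ancilla qubits, on superpositions, without ever collapsing the protected data.

```python
# Encode one logical bit into three physical bits: 0 -> 000, 1 -> 111
def encode(bit):
    return [bit, bit, bit]

# The "syndrome" is a pair of parity checks (q0 xor q1, q1 xor q2).
# Each check compares neighbors without revealing the encoded value.
def syndrome(q):
    return (q[0] ^ q[1], q[1] ^ q[2])

# Each syndrome pattern implicates at most one flipped bit
FLIP = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

q = encode(1)
q[2] ^= 1                   # noise flips physical bit 2
s = syndrome(q)
print(s)                    # (0, 1): the checks point at bit 2
if FLIP[s] is not None:
    q[FLIP[s]] ^= 1         # correct the implicated bit
print(q)                    # [1, 1, 1]: logical bit recovered
```

Notice that the syndrome (0, 1) identifies which bit flipped without ever reporting whether the encoded value was 0 or 1. That is the trick quantum codes exploit.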

Looking for a more detailed description? Find it at quera.com/glossary

Grover’s Algorithm

Grover's Algorithm

Some say that there are just two quantum algorithms today: Grover’s and Shor’s. While Shor’s gets all the headlines, Grover’s stands out because of its elegant simplicity. 3Blue1Brown has created a beautiful video explaining its operation.

By way of history, Lov Grover published his search algorithm in 1996, and it remains one of the foundational results in quantum computing. The problem it solves is simple: given an unsorted collection of items and a way to check whether any given item is the one you want, find the correct item. A classical computer must check items one at a time. Grover’s algorithm finds the answer using roughly the square root of the attempts a classical approach would need.

Think of searching for a specific card in a shuffled deck. A classical computer needs, on average, 26 checks for a 52-card deck. Grover’s algorithm finds the same card in about six checks, on the order of the square root of 52.

The algorithm begins by creating a superposition that assigns equal amplitude to every possible answer. It then repeatedly applies two operations. First, an oracle marks the correct answer by flipping its quantum phase, turning its amplitude negative while leaving all other amplitudes unchanged. Second, a diffusion operator compares every amplitude to the overall average and reflects them around it. Because the correct answer’s amplitude was flipped negative, this reflection pushes it sharply upward while suppressing the incorrect answers. This is quantum interference in action: the mathematical structure of waves causes wrong answers to cancel out while the right answer reinforces with each iteration.

After about six iterations for our 52-card example (the optimal count is roughly (π/4)√52 ≈ 5.7), the correct answer’s amplitude dominates, and a measurement will return it with high probability. If the problem has multiple correct answers, the algorithm still works and actually converges faster, since more marked states mean more amplitude to reinforce.
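The full loop is short enough to simulate directly. The sketch below (plain NumPy; the "deck" size and target index are arbitrary choices for illustration) applies the oracle phase flip and the reflection-about-the-mean diffusion step, and the success probability climbs close to 1:

```python
import numpy as np

N = 52                          # "deck" size
target = 17                     # the card we want (only the oracle knows it)

# Start in the uniform superposition: every amplitude is 1/sqrt(N)
amps = np.full(N, 1 / np.sqrt(N))

iters = int(np.pi / 4 * np.sqrt(N))       # (pi/4)*sqrt(52) ~ 5.7 -> 5 iterations
for _ in range(iters):
    amps[target] *= -1                    # oracle: phase-flip the answer
    amps = 2 * amps.mean() - amps         # diffusion: reflect about the mean

print(iters, np.abs(amps[target]) ** 2)   # 5 iterations, success probability ~0.998
```

The diffusion line is the whole "reflect around the average" operation from the text: each amplitude x becomes 2·(mean) − x, which boosts the one negative (marked) amplitude far above the rest.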

Two important limitations apply. The square root speedup is provably the best possible for unstructured search, making it a polynomial rather than exponential advantage. And the algorithm requires an oracle that can recognize correct answers, meaning you must be able to verify a solution even if you cannot find one directly.

While running Grover’s algorithm on large problems requires more capable quantum hardware than exists today, its underlying technique of amplitude amplification has become a building block inside many other quantum algorithms, from optimization to cryptography.


Neutral Atoms and Quantum Computing

In a week when Google announced it is starting a neutral-atom group, Atomique had two burning questions:

  1. What about IBM? After all, Amazon, Microsoft and Google all have a neutral-atom strategy.
  2. What are neutral atoms anyway?

Quantessa will answer the second question.

BTW, I sometimes get asked how I came to join QuEra and work on neutral atoms. About four years ago, I met Nate Gemelke at a UMD event. Nate is co-founder and now Chief Technology Strategist at QuEra. He explained that QuEra makes “analog Hamiltonian simulators”. I understood each word separately, but wasn’t sure about the whole sentence. Later, when I was looking for my next quantum adventure, I recalled that conversation and thought: QuEra seems to have really nice and smart people, and very cool technology, but there must be a better way to market this to the world. So I offered my services.

But back to our regularly scheduled programming: What are neutral atoms, and how are they related to quantum computers?

There are several ways to build a quantum computer, and they differ in what physical object serves as the qubit. Some approaches use tiny currents in superconducting circuits. Others trap individual charged atoms, called ions, using electric fields. Neutral atom quantum computers use a different approach: they hold individual atoms that carry no electric charge, suspended in place by focused laser beams called optical tweezers.

Because these atoms are neutral, they don’t repel or attract each other the way charged particles do, which makes them naturally well-isolated from unwanted interference. The laser tweezers can arrange atoms into precise patterns, hundreds or even thousands at a time, and rearrange them on the fly. To make two qubits interact (which is essential for computation), the atoms are briefly excited into a high-energy state called a Rydberg state, where they temporarily influence each other across short distances. When the operation is done, they settle back down. This gives neutral atom systems a combination of advantages: large numbers of qubits, flexible connectivity between them, and the ability to operate with relatively modest infrastructure compared to approaches that need extreme cooling to near absolute zero. Neutral atom quantum computing is still maturing, but it has emerged as one of the leading approaches for building the large-scale, error-corrected machines that will eventually tackle problems beyond the reach of classical computers.


Quantum Hype

Imagine walking through an auto show and seeing a sleek, glowing concept car. The builder promises it will soon fly you to work while you sleep. That sounds amazing, but the cars actually driving outside are still running on gas and struggling with traffic. Quantum computing faces a similar gap between grand promises and daily reality. This gap is called quantum hype. You often read headlines claiming these new machines will instantly cure diseases or break all internet security tomorrow. The truth is much slower. Scientists are still struggling to build machines that can perform basic tasks without making constant errors.

Currently, these computers are incredibly sensitive. A slight change in room temperature can ruin a calculation. We do not have machines that can replace your regular laptop. Instead of a magical problem solver, a quantum computer today is more like a delicate science experiment. Researchers spend most of their time just figuring out how to keep the machine stable. Cutting through this exaggeration matters because real scientific progress requires patience. If people expect miracles by next year, they might abandon the technology when those miracles fail to arrive. By focusing on actual engineering hurdles instead of science fiction, scientists can secure the steady support they need. This long-term work might eventually help us build reliable computers to design better batteries or discover new medicines.

Quantum Superposition

In everyday life, things have definite states. A coin on a table is either heads or tails. But at the quantum scale, particles don’t work that way. A quantum particle like an atom or an electron can be prepared so that its state isn’t determined yet. It has a set of probabilities for different outcomes, and only when you measure it does it land on a specific result. This is superposition: not “being in two states at once,” but existing in a state where the outcome is genuinely undetermined, with precise mathematical probabilities for each possibility.

What makes this useful, rather than just weird, is that quantum computers can manipulate these probabilities. A quantum algorithm carefully adjusts the probabilities across many qubits so that when measurement finally happens, the right answer is likely and the wrong answers mostly cancel out. It’s a bit like tuning a musical instrument so the note you want rings loud and the noise fades away. This ability to work with probabilities before measurement, rather than with fixed values, is what gives quantum computing its potential advantage for problems like molecular simulation, optimization, and cryptography. Superposition isn’t magic; it’s a precisely controllable physical property, and learning to harness it is what the entire field of quantum computing is built on.
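A tiny example of that "tuning": start in 0, create a superposition with a Hadamard gate, flip the sign of the 1-amplitude, and apply another Hadamard. The two paths interfere so that measuring 1 becomes certain. A minimal NumPy sketch:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
Z = np.diag([1.0, -1.0])                        # phase flip on the 1-amplitude

ket0 = np.array([1.0, 0.0])

# Superpose, adjust the phase, then interfere
psi = H @ (Z @ (H @ ket0))

print(np.abs(psi) ** 2)    # [0. 1.]: interference makes outcome 1 certain
```

Without the middle phase flip, the second Hadamard would undo the first and the qubit would return to 0; the phase adjustment is what steers the interference toward a different answer.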
