What Counts as Information?

In 1943, the telephone in Winston Churchill’s war room was a problem.

The transatlantic calls to Roosevelt were encrypted by a system called SIGSALY, fifty tons of equipment that digitized speech, scrambled it with one-time pads, and reassembled it on the other end. The voice channel was mathematically unbreakable. The Germans couldn’t decode a single word.

But Bell Labs engineer A.B. Clark noticed something troubling. The encryption scrambled the content of Churchill’s speech. It didn’t scramble the rhythm. The timing of syllables, the cadence of sentences, the patterns of pauses: all of it passed through intact.

Clark realized that rhythm alone might be enough to reconstruct words. He was right. The phenomenon would later be called “residual intelligibility.” The secret wasn’t in the message. It was in the shape of the message.

SIGSALY was redesigned to mask timing patterns. The fix added more equipment. More weight. More complexity. All to hide something no one had thought to protect: the rhythm.

This is the story of the twentieth century’s long education in what counts as information.


The Beeping Safe

The lesson had to be learned many times.

In the 1980s, hotel room safes used a simple four-digit code. Guests set their own combination, and the safe beeped once for each correct digit as it was entered.

Thieves figured it out immediately. Start with 0000. If the safe beeps once, the first digit is 0. If not, try 1000. Work through 0-9 until you hear a beep. Then move to the second digit.

A four-digit safe has 10,000 combinations. These safes fell in at most 40 tries: ten per digit, four digits.

The safe protected the code. It didn’t protect information about the code: how many digits were correct. That distinction cost everything.
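
In code, the thieves' procedure is a dozen lines. This is a toy sketch: beeps_at stands in for the physical safe, and the code is hardcoded so the demo is self-contained.

SECRET = "7294"  # the code the simulated safe holds

def beeps_at(guess):
    # The safe beeps as each correct digit is keyed in; the attacker
    # hears which keypresses produced a beep.
    return {i for i in range(4) if guess[i] == SECRET[i]}

def crack():
    code, tries = "", 0
    for pos in range(4):
        for digit in "0123456789":
            guess = (code + digit).ljust(4, "0")  # pad untested positions
            tries += 1
            if pos in beeps_at(guess):
                code += digit
                break
    return code, tries

print(crack())  # ('7294', 26) here; never more than 40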


The Collapse

Here is the mathematics of what happens when structure leaks.

A password with $n$ positions, each drawn from an alphabet of size $k$, has $k^n$ possible values. For a 4-digit PIN:

\[10^4 = 10{,}000 \text{ attempts}\]

But if the system reveals whether each position is correct independently, the attacker solves each position separately:

\[k \times n = 10 \times 4 = 40 \text{ attempts}\]

The exponent becomes a multiplier. Security collapses from $k^n$ to $kn$.

Secret length          Without leak ($k^n$)   With leak ($kn$)   Collapse
4 digits ($k=10$)      10,000                 40                 250×
8 lowercase ($k=26$)   208 billion            208                1 billion×
12 mixed ($k=62$)      $10^{21}$              744                $10^{18}$×

The longer the secret, the more catastrophic the leak. This is not a bug in one safe. It’s a theorem about information.
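
A few lines of Python reproduce the table:

def collapse(k, n):
    # k**n guesses without the leak; k*n with per-position feedback.
    return k ** n, k * n, k ** n // (k * n)

print(collapse(10, 4))   # (10000, 40, 250)
print(collapse(26, 8))   # (208827064576, 208, 1003976272)  ~1 billion x
print(collapse(62, 12))  # ~3.2e21, 744, ~4.3e18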


Paul Kocher’s Discovery

For decades after the beeping safe, cryptographers believed they had learned the lesson. Don’t leak partial information. Make verification all-or-nothing.

They were wrong. They had only learned half the lesson.

In 1996, a twenty-three-year-old cryptographer named Paul Kocher published a paper that redrew the boundaries of information security. He showed that even when software reveals nothing explicitly, the time it takes to compute is itself a channel.

Consider password verification:

def check(attempt, stored):
    if len(attempt) != len(stored):
        return False
    for i in range(len(stored)):
        if attempt[i] != stored[i]:
            return False  # early exit: the leak
    return True

This code returns as soon as it finds a mismatch. For an eight-character password, a wrong first character fails in about one microsecond; a wrong last character takes about eight.

The timing is the beep.
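
Here is the shape of the attack against the leaky check above. Everything in it is illustrative: the secret, the candidate alphabet, the trial count. A real attack, especially over a network, needs far more careful statistics than a minimum over repeated runs.

import string
import time

STORED = "hunter42"  # illustrative secret

def check(attempt, stored=STORED):
    # The leaky comparison from above.
    if len(attempt) != len(stored):
        return False
    for i in range(len(stored)):
        if attempt[i] != stored[i]:
            return False
    return True

def time_guess(guess, trials=2000):
    # Keep the minimum over many runs to filter out scheduler noise.
    best = float("inf")
    for _ in range(trials):
        t0 = time.perf_counter_ns()
        check(guess)
        best = min(best, time.perf_counter_ns() - t0)
    return best

def recover(length=len(STORED)):
    prefix = ""
    for _ in range(length):
        # The candidate that survives one more comparison runs longest.
        timings = {c: time_guess((prefix + c).ljust(length, "*"))
                   for c in string.ascii_lowercase + string.digits}
        prefix += max(timings, key=timings.get)
    return prefix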

Kocher demonstrated the attack against RSA implementations. By measuring how long decryption took for different inputs, he could extract private keys bit by bit. The attack worked remotely. It worked through noise. It worked against code that had been audited for security and revealed nothing through any official channel.

The cryptographic community was stunned. They had protected the message. They hadn’t thought to protect the duration of the computation.


The Expanding Boundary

Kocher’s work opened a floodgate.

If timing is information, what else might be?

1998: Power analysis. Kocher again. The power consumed by a chip varies with the operations it performs. Different instructions draw different current. By measuring power consumption during cryptographic operations, attackers could extract keys from smart cards.

1985: Van Eck radiation. Every wire is an antenna. CRT monitors emit electromagnetic radiation correlated with what they display. Wim van Eck showed you could reconstruct screen contents from across the street with a modified television set, publishing what governments had long kept classified under the TEMPEST program. Later researchers demonstrated the same for LCD screens, keyboards, and network cables.

2003: Cache timing. Modern CPUs have memory caches. Accessing cached data is fast; uncached data is slow. An attacker who shares your CPU can deduce what memory addresses you’re touching, including cryptographic key material.

2018: Spectre and Meltdown. CPUs speculate about which instructions to execute next. Sometimes they guess wrong and roll back. But the speculative execution leaves traces in the cache. These traces leak secrets across process boundaries, across virtual machines, across the boundary between user code and the operating system kernel.

Each discovery expanded the definition of “information.” First it was the message. Then it was metadata. Then timing. Then power. Then radiation. Then cache state. Then speculative execution.

The boundary is still moving.


Shannon’s Definition

The history of side channels is a history of expanding the concept of information itself.

Shannon defined information as reduction in uncertainty. A message conveys information because it tells you something you didn’t know. By this definition, anything that reduces uncertainty is information, whether or not it was intended to communicate.

The safe’s beep reduces uncertainty about which digit is correct. The computation’s timing reduces uncertainty about which branch was taken. The chip’s power draw reduces uncertainty about which operations occurred. The radiation from a cable reduces uncertainty about what it’s carrying.

None of these channels were designed. All of them communicate.
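
Shannon's framing makes the leak quantifiable. For the beeping safe:

from math import log2

before = log2(10 ** 4)  # uncertainty about the PIN: ~13.29 bits
after = log2(10 ** 3)   # once a beep confirms the first digit: ~9.97 bits
print(before - after)   # ~3.32 bits: exactly log2(10), one digit's worth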


The Shape of Everything

Netflix and traffic analysis. Your video stream is encrypted. An eavesdropper sees only noise. But the pattern of the traffic (packet sizes, timing, bitrate changes) correlates with what you’re watching. Researchers have identified specific movies from encrypted Netflix streams with over 99% accuracy.

Keystroke dynamics. The rhythm of your typing (how long you hold each key, the gaps between keystrokes) identifies you as reliably as a fingerprint. This works through encryption, through Tor, through any anonymizing layer.

Website fingerprinting. HTTPS hides which page you’re viewing but not the pattern of requests: how many resources, what sizes, what timing. The pattern is enough to identify the site with high accuracy.
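
A toy version of that last example, with made-up numbers. The observer can't read the encrypted bytes, but the sizes are in plain view, and matching them against known patterns is straightforward:

# Illustrative fingerprints: response sizes (bytes) seen when loading each site.
FINGERPRINTS = {
    "news-site":  [1500, 1500, 820, 4200, 310],
    "video-site": [1500, 9100, 9100, 9100, 640],
    "mail-site":  [980, 310, 310, 1500, 210],
}

def identify(observed_sizes):
    # Nearest fingerprint by total size difference. Real attacks train
    # classifiers over sizes, directions, and timing; the idea is the same.
    def distance(fp):
        return sum(abs(a - b) for a, b in zip(observed_sizes, fp))
    return min(FINGERPRINTS, key=lambda site: distance(FINGERPRINTS[site]))

print(identify([1490, 9050, 9120, 9080, 660]))  # video-site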

These are not attacks in the traditional sense. No lock is picked. No cipher is broken. The information was there all along, in the shape of the activity.


Your Rhythm

Type anything below. The content doesn’t matter. Your rhythm is the signal.

[Interactive demo: reports your mean inter-key interval (ms), its standard deviation (ms), and the resulting timing signature.]

Different people produce different patterns. The text vanishes into encryption. The signature persists.
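
The signature is nothing exotic: summary statistics over the gaps between keypresses. A minimal sketch, with invented timestamps:

from statistics import mean, stdev

def signature(keypress_ms):
    # Inter-key gaps are the content-independent part of typing.
    gaps = [b - a for a, b in zip(keypress_ms, keypress_ms[1:])]
    return round(mean(gaps), 1), round(stdev(gaps), 1)

alice = [0, 140, 260, 420, 530, 690]  # keypress times in ms (invented)
bob = [0, 90, 310, 370, 640, 700]
print(signature(alice))  # (138.0, 22.8): a steady typist
print(signature(bob))    # (140.0, 98.2): same speed, different rhythm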


The Arms Race Has No End

The natural response is: fix the leaks. And cryptographers have tried.

Constant-time code. Compare all characters, even after finding a mismatch. Make every branch take the same time. Never let computation depend on secret data.
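
A sketch of the fix in Python. In practice you would reach for hmac.compare_digest, which does the same thing in C; pure-Python timing is never perfectly flat, so this shows the structure of the defense, not a production primitive.

def check_constant_time(attempt, stored):
    if len(attempt) != len(stored):
        return False  # length still leaks; pad or hash inputs to hide it
    diff = 0
    for a, b in zip(attempt, stored):
        # Accumulate mismatches instead of returning early: every
        # position is compared, so runtime doesn't depend on the secret.
        diff |= ord(a) ^ ord(b)
    return diff == 0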

Blinding. Randomize the input before the secret-dependent computation, then strip the randomness from the result. The answer is the same, but the intermediate steps no longer correlate with the secret, masking power and timing patterns.
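
For RSA, blinding looks like the sketch below: decrypt a randomized ciphertext, then remove the randomness afterward. The function name is mine, and pow(r, -1, n) needs Python 3.8+.

import secrets
from math import gcd

def decrypt_blinded(c, d, e, n):
    # Pick a random blinding factor coprime to the modulus.
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            break
    c_blind = (c * pow(r, e, n)) % n  # randomize the ciphertext with the public key
    m_blind = pow(c_blind, d, n)      # the secret-dependent step sees only randomness
    return (m_blind * pow(r, -1, n)) % n  # unblind: (c^d * r) * r^-1 = c^d (mod n)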

Shielding. Faraday cages block electromagnetic radiation. Expensive. Imperfect. A well-funded adversary can still get through.

Each fix addresses one channel. But physics guarantees there are always more.

Computation requires energy. Energy dissipates as heat, radiation, vibration. Any physical process leaves traces. The question is not whether information leaks, but whether an adversary can detect it.

For the NSA, the threshold is low. For a random thief, high. But the threshold keeps falling as sensors improve and analysis gets cheaper.


The Moving Boundary

Here is what the twentieth century taught us:

Information is not what you intend to reveal. It is what can be inferred from what you do.

The boundary between “the secret” and “information about the secret” is not fixed. It moves as our ability to measure improves. What leaked undetectably in 1980 leaks detectably in 2000. What seems safe today will leak tomorrow.

This is not paranoia. It is physics.

Churchill’s engineers thought they were protecting speech. They were forced to protect rhythm. Cryptographers thought they were protecting keys. They were forced to protect timing, power, radiation, cache access, and speculative execution.

Each generation discovers that the last generation’s definition of “the secret” was too narrow.


Privacy, Anonymity, AI

Privacy. Every digital action has a shape: timing, size, frequency. Encryption hides content but not shape. Metadata is data.

Anonymity. Behavioral patterns are signatures. How you type, how you move your mouse, how you browse. These patterns persist across contexts. True anonymity requires suppressing not just identity but behavior, and behavior is hard to fake.

AI. Machine learning excels at finding patterns humans miss. The leaks that were too subtle to exploit manually become tractable with enough data and compute. The attacker’s threshold keeps dropping.


The Lock Will Always Talk

Return to the beeping safe.

The fix seems obvious: remove the beep. But the beep was not the problem. The problem was that the mechanism processed each digit separately, and that separation was detectable.

Make the electronics silent and the timing still differs. Equalize the timing and the power draw still differs. Shield the power and the EM radiation still differs. Shield everything and the fact that you’re standing in front of the safe still reveals something.

Every physical process is a broadcast. Security is not about stopping the broadcast. It is about ensuring the broadcast conveys nothing useful.

This is hard. It requires understanding what information is: not what you intend to transmit, but what can be received.

The safe’s designers protected the code. They didn’t think to protect the rhythm of the mechanism, the heat of the electronics, the sound of the buttons.

Churchill’s engineers protected the words. They didn’t think to protect the cadence.

The lock will always talk. Security is teaching it to say nothing useful.