root labs rdist

October 24, 2008

Quantum cryptography is useless

Filed under: Crypto,Hardware,Security — Nate Lawson @ 1:08 pm

Bruce Schneier is right on the money with this article criticizing quantum crypto.  Quantum cryptography (not quantum computing) is one of those concepts that appeals to the public and the more theoretically minded, but is next to useless for providing actual security.  Here are some lesser-known problems with it that I haven’t seen discussed much elsewhere.

Quantum cryptography is focused on one narrow aspect of security: key establishment.  Put simply, this involves exchanging photons with a particular polarization.  The transmitter randomly sets its polarizer to one angle or the other and sends a photon.  The receiver likewise tunes its detector to a randomly chosen polarization.  The value measured at the receiver depends on both the sending and receiving polarizer states, as well as the bit transmitted.  The sender and receiver repeat this process many times and then compare their polarizer choices over a classical channel, yielding a set of bits.  Some will be errors, and some will be usable as a key, shared by both but secret from everyone else.

To intercept a photon, an attacker also has to choose a polarization.  Measuring the photon disturbs its state, so the state of the attacker’s polarizer is encoded in the photon that gets passed along.  This means the attacker cannot learn the bits without desynchronizing the sender and receiver.  If the error rate is too high, either an attacker is present or the fiber is bad.
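To make the sift-and-check step concrete, here is a toy simulation of the process described above. This is my own illustrative model (no channel noise, perfect detectors), not any real QKD implementation:

```python
import random

def bb84_sift(n=10000, eavesdrop=False, seed=1):
    """Toy BB84: returns (sifted key length, observed error rate).

    Idealized: no channel noise, perfect detectors.  An intercept-resend
    attacker (Eve) must guess a basis per photon, which pushes the error
    rate in the sifted key toward 25%.
    """
    rng = random.Random(seed)
    matched = errors = 0
    for _ in range(n):
        bit = rng.randrange(2)            # Alice's random bit
        a_basis = rng.randrange(2)        # Alice's random polarizer basis
        pol = (bit, a_basis)              # photon modeled as (bit, basis)
        if eavesdrop:
            e_basis = rng.randrange(2)    # Eve guesses a basis to measure in
            e_bit = bit if e_basis == a_basis else rng.randrange(2)
            pol = (e_bit, e_basis)        # ...and resends in her own basis
        b_basis = rng.randrange(2)        # Bob's random detector basis
        b_bit = pol[0] if b_basis == pol[1] else rng.randrange(2)
        if a_basis == b_basis:            # sift: keep matching-basis rounds only
            matched += 1
            errors += int(b_bit != bit)
    return matched, errors / matched

print(bb84_sift())                 # error rate 0.0 in this noiseless model
print(bb84_sift(eavesdrop=True))   # error rate near 0.25 reveals the attacker
```

Roughly half the rounds survive sifting, and a wrong-basis measurement by Eve randomizes Bob's result, so her presence shows up as an error rate near 25% instead of zero.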

There are a number of well-known issues with quantum crypto.  The key exchange process requires a reliable source of random bits at both the sender and receiver, one that an attacker cannot influence.  There also needs to be some kind of pre-existing shared secret so the two parties can authenticate each other, and the only way to do this is through classical crypto (e.g., public key).  Otherwise, an attacker could just splice a receiver and transmitter into the cable and perform a standard MITM attack.  Finally, the actual communication between the two parties is encrypted with a traditional cipher like AES, using the shared key.

I think this alone is enough to undermine the case for quantum crypto.  If you are convinced to spend lots of money and effort to replace classical crypto for the sole purpose of key exchange, shouldn’t standard crypto be considered unreliable for authentication and bulk encryption also?  This is Bruce’s point and is based on simple threat analysis.

Recently, this paper was published describing how bright flashes of light could temporarily overwhelm the detector circuit, allowing an attacker to trick the receiver into accepting bits as unmodified.  The receiver has multiple detectors, each with a different polarization.  Normally, only a photon with the proper polarization triggers the corresponding detector.  But a bright pulse at a particular frequency can temporarily blind multiple detectors, leaving the remaining one to trigger on a subsequent pulse.  By varying the frequency of the bright pulse to select individual detectors, the attacker can manipulate the receiver into thinking the originally transmitted bits are being received correctly.
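The effect of the attack can be shown with a toy model. This is entirely schematic (invented names, two detectors, blinding as a simple set); real single-photon detectors and their saturation behavior are far more complicated:

```python
# Toy model of the detector-blinding attack: a bright flash saturates chosen
# detectors so they ignore the next pulse, and a follow-up strong pulse then
# fires whichever detector is left, letting the attacker pick the recorded bit.

class ToyReceiver:
    def __init__(self):
        self.blinded = set()   # detectors saturated by the last bright flash

    def photon(self, polarization):
        """A single legitimate photon clicks the matching detector."""
        clicked = polarization if polarization not in self.blinded else None
        self.blinded.clear()   # blinding is only temporary
        return clicked

    def bright_flash(self, detectors):
        """Attacker: overwhelm these detectors for the next pulse."""
        self.blinded = set(detectors)

    def strong_pulse(self):
        """Attacker: a pulse bright enough to click any unblinded detector."""
        live = [d for d in (0, 1) if d not in self.blinded]
        self.blinded.clear()
        return live[0] if len(live) == 1 else None

def forge_bit(rx, bit):
    rx.bright_flash({1 - bit})   # blind every detector except the target
    return rx.strong_pulse()     # only detector `bit` can fire now

rx = ToyReceiver()
print([forge_bit(rx, b) for b in (0, 1, 1, 0)])  # prints [0, 1, 1, 0]
```

The receiver sees clean, error-free clicks for exactly the bits the attacker chose, which is why the error-rate check never fires.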

This is a clever attack because it is based on unstated assumptions.  Quantum crypto doesn’t specify how a perfect detector circuit can be designed.  That’s just a black box.  The designers assume that individual photons can go into the black box and be measured securely.  But it’s not a black box, it’s a real world circuit that depends on optoelectronics and has associated limitations.

You can draw an analogy here to side channel or differential fault analysis.  The AES algorithm might be considered computationally secure, but a real-world system that implements it rests on many underlying assumptions.  There has to be a computer running the algorithm.  It may be realized in hardware or software.  It requires memory (DRAM, disk, core, tape?) to store intermediate state.  It takes up physical space somewhere.  It gets its input over some kind of bus.  The environment may be hostile, with varying voltage, heat, clock frequency, etc.

What other attacks could there be?  Is it possible to determine which polarizer was selected in the transmitter by probing it with light while it is preparing to send the photon?  Does it consume more current when switching from one state to another?  Are there timing variations in the transmitted photons based on whether the polarizer switched state or not?

Classical crypto implementations have been continually hardened in response to these kinds of attacks.  I think the perceived theoretical invulnerability of quantum crypto has resulted in less attention to preventing side channel or fault analysis attacks.  In this sense, quantum crypto systems are less secure than those based on classical crypto.  Given its cost and narrow focus on strengthening only key exchange, I can’t see any reason for using quantum crypto.

5 Comments

  1. Just to point out that authentication for QKD can be done with an authentication code (AC), which is not based on any computational assumption. Also, AES is not a must for data encryption: you can use the Vernam cipher (one-time pad), which is “unconditionally secure” (or information-theoretically secure). Thus the whole encryption is “information-theoretically secure”. But the price is also obvious: the pre-distributed key (for authentication) is large, and not practical at all. Sometimes people just need another choice, even if it is not necessarily taken.

    Comment by hi — October 26, 2008 @ 10:21 pm

  2. I believe you’re referring to Authentication of Quantum Messages by Barnum et al. Does any product actually use this with QKD? It seems like it requires a large, pre-distributed shared secret. And it is based on one computational assumption, namely the security parameter “s”. I think you mean its security is not based on a number-theoretical problem (like RSA and factoring). The value you choose for “s” (128 = 2^128 complexity) does depend on the level of work you think your opponent can perform.

    One-time pad is not very usable with QKD systems since you have to exchange 1 bit per bit of message you encrypt. At distance, this is only a few hundred bits per second. There may be attempts to increase the bit exchange rate but I doubt OTP encryption is used for anything other than the smallest messages.

    Comment by Nate Lawson — October 28, 2008 @ 9:42 am

  3. >> The value you choose for “s” (128 = 2^128 complexity) does depend on the level of work you think your opponent can perform.

    In the paper you referred to, s is the security parameter: the adversary learns the secret key with probability 2^{-s}, and this holds with no assumption about the adversary’s computational power. In fact, the paper shows that if you can do unconditionally secure authentication, you can already perform perfect encryption.

    QKD can be performed without computational assumptions; however, it does need some other assumptions, e.g., that the current hypotheses of quantum physics are correct. Historically, a scheme without computational assumptions has also been called “unconditionally secure”, which is quite misleading.

    With current technology, QKD also gets weaker as the distance increases.

    I agree with you that QKD is not practical at all, and QC cannot solve all the problems we have at hand.

    Comment by hi — October 28, 2008 @ 7:53 pm

  4. Thanks for the comments. Trevor also referred me to the original paper on universal families of hash functions by Wegman and Carter. Here is a great description of this.

    The surprising result is that randomly selecting from a family of non-cryptographically-secure hash functions, and encrypting the “selector” as well as the hash result gives an unconditionally secure authentication scheme. This is similar to the basis for Galois counter mode.

    I may have to write a post about this in the future.

    Comment by Nate Lawson — October 29, 2008 @ 5:12 pm

  5. thx1.0e3

    which brings us back to using any algorithm such as that described still works on the simplistic concept like the DOD logx to x and this to thjat key crap….

    I rather see utilization of biological living tissue to perform these calculations…DNA is a great computer
    and man can two carrots fly….

    jezz when was that Princeton research on dna calculating my flight from la to ny hehehehhehe like 1986

    Comment by CM — December 7, 2008 @ 6:57 pm
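The universal-hash authentication discussed in comment 4 can be sketched in a few lines. This is an illustrative toy in the Wegman-Carter style, not the exact construction from their paper: hash the message with a polynomial evaluated at a secret random point (the “selector”), then one-time-pad the short tag.

```python
# Toy MAC in the Wegman-Carter style (illustrative sketch, not the paper's
# exact scheme): a polynomial hash family keyed by a secret point r, with
# the short tag encrypted by a one-time pad.
P = (1 << 61) - 1  # a prime modulus; distinct messages collide w.p. ~len/P

def poly_hash(blocks, r):
    """Evaluate the message blocks as a polynomial at the secret point r."""
    acc = 0
    for b in blocks:
        acc = (acc * r + b) % P
    return acc

def tag(blocks, r, pad):
    # r plays the role of the secretly shared "selector" choosing the hash
    # function; pad is a fresh one-time key consumed per message.
    return (poly_hash(blocks, r) + pad) % P

t = tag([1, 2, 3], r=123456789, pad=987654321)
assert t != tag([1, 2, 4], r=123456789, pad=987654321)  # any change shifts the tag
```

Because the family is universal, a forger who never learns r can produce a valid tag for a modified message only with probability about len(blocks)/P, regardless of computational power; no number-theoretic hardness assumption is involved. GHASH in Galois counter mode is built on the same idea.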

