Monday, July 28, 2025

Bollocks

 Professor Peter Gutmann, of the University of Auckland, a well-known computer security researcher, has made something of a name for himself as a quantum skeptic, at least with respect to quantum cryptanalysis and post-quantum cryptography (PQC).  His argument is roughly two-pronged:

  1. Quantum computers aren't making progress on factoring at all; and
  2. even if they did, computer security people and cryptographers have much larger problems to worry about.
I agree with him pretty strongly on point #2.  I've said at various times that the mathematical vulnerability of Diffie-Hellman key exchange or RSA authentication is not very high on corporate CSOs' lists of worries.  On point #1, I'd say he's technically right about the current state, but rather dramatically wrong in concluding that he can project onward for the next couple of decades and declare the world safe from quantum computers.

Sometime in the last year or two, Peter gave a talk somewhere titled, "Why Quantum Cryptanalysis is Bollocks". The slides themselves have no date or talk venue, but evidence from within them suggests mid-to-late 2024. Some blog postings also assert that date, as does a very recent article in The Register, which is how Peter and this talk came to my attention.  The slides make the talk look like a lot of fun, so I wish I could hear it in person or even via recording.  So let's take a look at the logic in the talk, then I'll tell you why I think he is misunderestimating the quantum folks:

  1. Germany had a massive weapon systems boondoggle during WWII.
  2. OWASP lists a lot (tens of thousands!) of threats to the security of computer systems, and the highest-ranked attack on the mathematics of encryption was #17,245 (not a typo).  Roughly, the argument is:
    1. Mathematical attacks are high-effort and have a low probability of success;
    2. even if you succeed, you recover a few bits of the contents of one message; and
    3. whereas with the top ten high-priority problems, when you succeed you win big -- you get "the plaintext of all of the messages."
    4. (And holy cow, not directly in Peter's talk, but there are always examples of how human stupidity is the number one threat!)
  3. NSA can already factor 1,024-bit RSA keys, if they're willing to commit leading-edge supercomputer time in allocation chunks of a year at a time.
  4. Quantum computer-based factoring has grown from 4 bits to 5 bits over the last 20+ years.  (For a concrete reminder of what those demonstrations actually computed, see the sketch just after this list.)
  5. Quantum computers are physics experiments, not computers.
  6. (Brings up poorly-regarded D-Wave factoring.)
  7. PQC is very hard to implement well, and costs a lot in engineering time and network/execution time.
  8. (Various disparaging remarks about academics and standards organizations becoming self-justifying.)
  9. "Quantum cryptanalysis is the string theory of security" and my dog can factor just as well.
Let's see...let me respond one by one:
  1. Yes, and while cautionary tales are good, cautionary tales are not the same thing as prediction or even valid analogy.
  2. Okay, but I think the implied message here is off: if you can crack one Diffie-Hellman key exchange, you gain the ability to read everything in that conversation (n.b.: it's harder than just factoring or discrete log; there are other mechanisms involved), but the bigger catch would be the RSA private key of an important individual, which would allow you to impersonate them across a range of systems; certainly there are organizations that would pay a lot of money for that.  Of course, I'd argue that truly high-value targets connecting to high-value systems are pretty easily -- and likely already -- secured via shared private key, so cracking RSA is lower value than it first appears.  Peter is definitely more knowledgeable than I am in this area.
  3. Okay, but is that relevant to the anti-quantum argument? Is the argument just that people won't really commit big resources to factoring? I'd like to hear the oral argument that accompanies this point.
  4. This is the big one: he's saying progress in development of quantum computers is so poor that we can effectively discount them as any sort of threat. Ooh...okay. It's a fair point that reported successes in the literature on the task of cryptanalysis are advancing at a glacial pace.  (We have worked on this topic.) But projecting from past progress to future progress is dangerous in this field. We have known since the beginning of this field that error correction would be necessary.  Until we hit the threshold that allows quantum error correction to be executed effectively, progress on hardware did not translate into equivalent algorithmic successes.
    Well, guess what? The relentless improvement in hardware means we have passed that basic threshold on at least two very different hardware platforms in the last two years.  At least two other companies have released roadmaps that take us to large-scale, fault-tolerant systems within five years.  Publishing a roadmap at that level means they think they know how to solve all of the problems standing in their way.  Even if they are off by a factor of two, that still means we're there within a decade, and I'd bet sooner.
    So my opinion is that pooh-poohing the likelihood of the advent of cryptographically relevant quantum computers (CRQCs) seems unwise.  I think it's bordering on irresponsible to assume the technology won't happen; the argument instead needs to be about how much to prioritize countermeasures.
  5. In today's environment, strongly agreed.  Dave Farber said to me several years ago (perhaps as far back as 2019, though I think it was a little more recently than that), when I showed him some Qiskit code, "This isn't an application, it's an experiment."  (There's a small sketch of the kind of code I mean just after this list.)  I think we as a community need to think very hard about how to deliver hardware, runtime systems and tools, and applications to customers.
  6. (Pass.)
  7. Cost of PQC is high -- oh, yes, definitely.  I attend IETF meetings and listen to people moan about how hard it is.  I'm not an expert here, though.
  8. (Pass.)
  9. Funny!  (I need a dog.  I love dogs.  But I'm allergic to dogs, work too much and travel too much, and I think dogs in Japan don't have good lives, but all that's for a different posting, some other day...)
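To make Dave's quip in #5 concrete, here is roughly what a small quantum program looks like today -- not the code I showed him, just an illustrative sketch, assuming Qiskit 1.x with the qiskit-aer simulator installed.  You hand-assemble a circuit out of individual gates, fire it a thousand times, and read back a histogram of measurement outcomes, which is much more like running a lab apparatus than calling an application:

    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    # Build a two-qubit Bell-state circuit, gate by gate.
    qc = QuantumCircuit(2, 2)
    qc.h(0)                     # put qubit 0 into superposition
    qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])

    # "Run the experiment": 1,000 shots on a local simulator,
    # then look at the statistics of the outcomes.
    backend = AerSimulator()
    job = backend.run(transpile(qc, backend), shots=1000)
    print(job.result().get_counts())   # roughly half '00', half '11'

Nothing in there is something a customer deploys; it's a shot-by-shot measurement, and turning that into deliverable systems, tools, and applications is exactly the problem I mean.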
Verdict: pundits, and occasionally quantum people, oversell how soon and how big the impact will be -- I'd agree with that.  But the machines are coming.  Make no mistake about that.  So, it's up to you, as a business person, engineer, researcher, bureaucrat, CSO.  How will you respond?

(Also, if you care about broader skepticism of quantum computing, you may want to go look at a blog posting I wrote about four years ago. Geez, time flies.)
