Love me some Peter Gutmann. This is classic Gutmann.
A lot of this presentation is mooted by understanding PQC as a scientific question rather than an engineering one. What are the precise natures of quantum-superior attacks on cryptosystems, and what are the key establishments and signatures that resist those attacks? Whatever else you think of quantum cryptanalysis, those are undeniably important theoretical questions.
A few more slides are mooted by the likelihood that any mainstream deployed PQC system is going to be hybridized with a classical cryptosystem.
As an articulation of a threat model for modern computing, it simultaneously makes some sense and proves too much: if you think OWASP-type vulnerabilities are where everyone's head should be at (and I sort of agree), then all of cryptography is a sideshow. I'm a connoisseur of cryptographic vulnerabilities that break real systems the way SQL injection does (a bitflipping attack on an encrypted cookie, a broken load-bearing signature scheme) but even I have to admit there's 1 of those for every 10,000 conventional non-cryptographic attacks.
But of course, it also depends on who your adversary is. Ironically, if you're worried about state-level SIGINT, the barrier for OWASP-style attacks may be higher than that of large-scale codebreaking; passive interception and store-now-decrypt-later is the SIGINT love language.
My biggest thing with all of this is a core belief about organizations like NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost.
(Also, his RSA-1024 analysis is off; it's missing batch attacks).
Reading Gutmann I see a disconnect from reality. Yes, a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. These are interesting research topics and interesting discussions. But it doesn't matter. Using that as an argument against PQC is moot, no longer relevant.
The decision to move the world to PQC has already been taken, and we are in full transition mode. Orgs like NATO have transitioned. Many have published time plans to phase out classical algos (NSA, GCHQ, Australia) or will shortly (EU). Orgs like ETSI will transition to PQC in upcoming revisions (5G) and next-gen systems (6G). Just to name a few.
In general, the agencies and nations pushing for PQC algorithms and transition time plans seem to be curiously well aligned. All use the NIST PQC algos, with no real mention of hybrid solutions. Australia's plan to ban the use of SHA-256 is one that sticks out a bit from the others.
Regarding time plans, I find the suddenness, almost haste, to be quite interesting. I've been searching quite extensively to gather info for PQC talks, and the stated time plans are rarely well motivated. Much hand waving. And what drove the rush? I'm suspecting a bit of a crowd panic. But there is a clear change around 2022, 2023. Suddenly the PQ threat became very important. And within 10 years (or less) the world will be using PQC basically everywhere PKI is used, besides legacy systems where lifetimes and update issues make the change unrealistic.
And are we really sure the schemes used will be hybrid? If DJB is right, NIST is pushing for PQC only. The NSA recommendations that were released last year AFAIK do not state anything about hybrid solutions. Details are less clear on what NATO has transitioned to, but it seems to be PQC only. And the ETSI push for PQC in 5G, 6G does not seem to aim for hybrid solutions. I would love to be told I'm wrong about the use of hybrid schemes.
The iMessage protocol, Signal, and things happening in the IETF go for hybrid schemes, which I think is the right way (if that counts for anything). So it seems we have the open world and the private sector versus governments and orgs fairly close to governments when it comes to hybrid vs PQC only.
But maybe this is actually the reason behind the presentation: he sees this vast effort to transition to PQC, and he's wondering whether it is worth it or not.
And for what it's worth, I'm also a bit skeptical of the maturity of PQ cryptosystems, as they tend to be much more complicated than their classical counterparts and haven't received as much scrutiny. What happened to Rainbow (layered UOV) is a cautionary tale.
Yes, but that is why I say that his argument is moot. It is probably not worth going to PQC, and it will probably expose us to problems. But it is too late to try to get people to understand and agree on this.
The world has decided that the perceived risk makes it worth going to PQC, and we can't change that fact no matter the nice pictures of Schwerer Gustav. We are in full transition mode and will not go back. Even if a CRQC never materializes.
We probably will experience a number of issues and problems that we wouldn't have had by not going to PQC. Driving costs, making things vulnerable. More work for us, I guess.
> I find the suddenness, almost haste to be quite interesting.
> But there is a clear change around 2022, 2023.
I think that's probably because the NIST competition [1] to choose their standard algorithms really started to heat up then.
NIST has a very large gravity well in the academic and industrial cryptographic community, so as soon as it became clear which algorithms NIST would pick (they chose Kyber / ML-KEM and Dilithium / ML-DSA), the (cryptographic) world felt it could start transitioning with much more certainty and haste.
1. https://csrc.nist.gov/projects/post-quantum-cryptography/pos...
Yes, that is one aspect, and when the drafts were published you could see orgs start running (I've got a nice timeline in my slides). But I still find the haste interesting. There is very little time for the transitions compared to the adoption rate of other crypto standards. The NIST algos are imho still quite immature, which is one big motivation for hybrid schemes.
A bit off topic: as a European, with what is happening with DOGE, slashing funding for CISA, TAA etc, I'm seriously worried about NIST. As you say, NIST is very important in many areas. For the USA, with things like the coordinated universal time standard. But also for federal cybersec standards that have led to interop with the rest of the world cryptographically. Will NIST be slashed, and if so will the crypto department be spared? If not, what would remain? New standards, the validation program? Will Falcon become a standard, or for that matter the new lightweight symmetric algo based on Ascon? (For which I'm eagerly waiting for NIST to publish test vectors so that I'm able to verify that my implementation is compliant.)
I think the haste is probably down to a risk calculation. If practical quantum breaks of classical crypto don't materialise in the next 5-10 years, "all" that's happened is we've cycled onto a new cipher suite sooner than we otherwise would have.
The reverse picture, where they do and we haven't, is so colossally damaging that it doesn't matter if the probability of quantum breaks landing is actually quite small. In expected value terms we still come out ahead.
You don't need to assume that someone in an NSA lab has already demonstrated it for this to work out, and you don't need to assume that there is ever a practical quantum computer deployed for this stuff. All you need is for the probability to be above some small threshold (1%? 5%? I could believe something in that range) to make running for the exits the right move today.
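A toy expected-value comparison of that argument; the numbers below are made-up placeholders chosen only to show the shape of the calculation, not estimates:

    # Illustrative only: compare the cost of migrating "early" against the
    # expected cost of waiting and having recorded traffic decrypted later.
    p_crqc          = 0.05     # assumed probability a CRQC arrives within the window
    cost_migration  = 1.0      # one extra cipher-suite rollout (normalized units)
    cost_caught_out = 1000.0   # mass decryption of stored traffic, same units

    expected_cost_migrate = cost_migration
    expected_cost_wait    = p_crqc * cost_caught_out
    print(expected_cost_migrate < expected_cost_wait)   # True even for a small p_crqc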
How does the calculation look if the thing we migrate to ends up being broken far more easily than the classical algorithms? Because the current plans aren't to migrate to just hybrid classical+PQC schemes, the plans are to migrate to PQC fully. Discarding both RSA and ECC.
> Because the current plans aren't to migrate to just hybrid classical+PQC schemes, the plans are to migrate to PQC fully. Discarding both RSA and ECC.
This isn't true. NIST has been saying that, but everyone else just laughs and implements hybrid since throwing out RSA/ECC is so obviously stupid.
If you have references to nations or governments that state that the transition will be hybrid, I would love to get them. The EU transition will not be hybrid. The NSA plan is not hybrid. ETSI is not hybrid.
My view is that the IETF and commercial entities such as Apple, Google, and the open source world are the ones going hybrid. In this case I would love to be wrong.
> NIST has been saying that, but everyone else just laughs and implements hybrid since throwing out RSA/ECC is so obviously stupid.
The Australian government is also saying this.
That is a very relevant point. Add a bit of scare mongering, herd mentality, and downplaying of the technical effects and risks, and you get the ones setting policies deciding to transition - just like everybody else.
When I have seen time estimates, everyone refers to Mosca's Theorem. This is the idea that "store now, decrypt later", combined with the estimated time until working quantum cryptanalysis is feasible and the finite transition time for existing crypto standards and technologies (think update times for long-living tokens like ID cards with certificates), makes the available delay until a change must start quite short.
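A minimal sketch of the inequality usually attributed to Mosca; the variable names below are mine, not from any standard:

    def mosca_deadline_missed(shelf_life_years, migration_years, years_to_crqc):
        # "Store now, decrypt later" bites when the time the data must stay
        # secret, plus the time the migration takes, exceeds the time until a
        # cryptographically relevant quantum computer exists.
        return (shelf_life_years + migration_years) > years_to_crqc

    # Example: 25-year secrets, a 10-year migration, a CRQC guessed at 20 years out.
    print(mosca_deadline_missed(25, 10, 20))   # True -> the migration has to start now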
Some additional facts that may aid the discussion:
- Dilithium + Kyber is faster than ECDSA + ECDH on the same hardware. Depending on the platform, it can be up to 33% faster.
- Most commercial entities are implementing hybrid, but in concatenation, not layered, mode. The classical inclusion is mainly for compatibility with "legacy" systems.
As long as the speed difference isn't orders of magnitude, and unless you are doing many, many session inits, I don't see this as a real argument. For embedded systems, the difference is indeed at least 10x, as we observed running Dilithium on RV32I on the Tillitis TKey (https://tillitis.se/). EdDSA takes about a second, which is OK for a single signature; ML-DSA on the same platform is ~20 seconds. But yes, if you are a web server and don't scale automatically with the number of sessions, the better performance is good.
The negative thing for all systems with the current NIST PQC algorithms is the longer keys, which get even worse when going hybrid. See, for example, the experiments by Cloudflare with PQC in TLS and with adding PQC keys to certificates.
I agree that for certificates there is a possibility of being compatible with legacy systems by adding the ML-DSA signature as an extension (with the legacy systems being able to handle the larger certs). But please show references for compatibility being the main reason for hybrid. If we look at the motivation for the TLS 1.3 hybrid scheme, it states:
"The primary goal of a hybrid key exchange mechanism is to facilitate the establishment of a shared secret which remains secure as long as as one of the component key exchange mechanisms remains unbroken."
That page states that backwards compatibility may be one of several possible additional goals. But it is not the main goal.
The problem PQC algorithms must solve is not only resistance to attacks by future quantum computers, but also to attacks on classical computers. Their security must scale with the number of bits in the key about as fast as for classical algorithms, and they must work about as well as classical algorithms on classical computers.
All ciphers have warts. The ones we use are the ones we think are secure but also have warts we can live with. RSA scales slowly with the number of bits (and has other warts). EC scales faster and becomes faster than RSA for the same strength, but has other warts. McEliece seems like a good, conservative PQC algorithm, but those keys... TDEA was deemed to provide too low a security margin, but it was also too slow (48 rounds) and had too small a block size.
"Reading Gutmann I see a disconnect from reality. Yes a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. [...] But it doesn't matter."
I hear you saying this: Gutmann's factual points may be correct. But he is discussing quantum computing. Quantum computing is so irrelevant to post-quantum cryptography that even mentioning it in that context makes Gutmann seem disconnected from reality.
I don't necessarily disagree. I'm just trying to make sure I understand.
Good question. I may not have explained it very well. Let me give it a try.
1. I read the first part, with Schwerer Gustav etc, as an argument that a CRQC will never become possible, so moving to PQC is not needed.
2. I read the second part, about the hardness of attacking classical algorithms (for example the DES cracker), as an argument against attacks on crypto being an actual threat vector. Which ties in to point 1, as a CRQC would be very expensive and hard to use in practice. It is not a computer but a physics experiment.
3. He then talks about real threats and how they don't change very much. Points to OWASP top ten etc.
I totally agree with him on points 1 and 2. I'm just as skeptical. But what I'm saying is that it is too late. We are quite possibly switching to algorithms that will never add any security, spending huge resources and pushing unneeded changes to systems around the world. Telling the world that it is unnecessary will not change that fact.
Coming to point 3, I don't agree with him. Yes, the OWASP top 10 shows that the same more or less trivial attacks are the ones being used. Nobody uses a zero day unless it is needed. If I can become sysadmin through a reused password or a misconfiguration of Teams, why use something more advanced? But I see him using this as an argument against broken ciphers being a real threat vector, and that is a different kind of attack.
The first one is an active attack against a system. It may be a nation state actor that wants to infiltrate, get persistent access, exfiltrate, and possibly destroy the system. It may be a ransomware organization. They will do the same thing, but their timeline is much shorter. (And of course the difference between a nation state and organized crime can be very blurry.)
But recording Internet traffic and, over a long time (days, months, years, decades), trying to decrypt it is solely of interest to a nation state. And for that attack and end game, what the OWASP top ten looks like is totally irrelevant. It is not an active attack against a system. It is done in secrecy by entities with a lot of patience and huge resources. For them quantum computers are very interesting and relevant to discuss. But not in public.
I guess that was a way too long answer. But I hope it explained what I meant.
> NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost
I’m tempted to draw an analogy to the space race of the 60s, and the current AI panic
I'm with you, but the government (at least in the US and UK) should definitely be spending more time figuring out how to patch reliably, and a little less on PQC.
Perhaps the other side of the coin Peter calls "churn" is "updated software". Government, like every other old organization, has a ton of legacy systems. Even if CRQC is a pipe dream, having an excuse to tell everyone across the board they need to update their stuff sometime in the next decade might be a net win.
> if you're worried about state-level SIGINT
... you might be worried about other things like postal interception, or the physical security of your devices. You can never order any electronics or anything online, or use courier services, if you are a person of interest for state-level spooks like the FBI or CIA. Tech spooks making in-person visits to your home, phone, or computers is also possible.
State SIGINT is generally a lot easier than many of those actions, especially if the state you're primarily concerned about isn't the one you live in.
It's not as hard as you may think.
For many foreign states like the US and China it's easier for them to intercept and alter packages going through borders, both legally and logistically.
It has already been documented how the CIA and FBI have a permanent presence in the main courier airport hubs. It's easy for Amazon, FedEx, or USPS to divert a packet or letter through a conveyor belt that goes through government areas.
Same for the Chinese, of course. Anything ordered from China with your name or address can't be trusted if you believe you might be a sufficiently important person of interest for them.
"Love me some Peter Gutmann. This is classic Gutmann."
Indeed, he doesn't hold back and is quite funny in his frankness.
Though when it comes to his book draft, he really needs an editor. https://www.cs.auckland.ac.nz/~pgut001/pubs/book.pdf
All this "mooting" is just not happening as described above. Anyone can look at the presentation and see that it's not irrelevant.
I've heard his name, but after reading this, I'm calling him a national treasure now.
Right after I launch my new cryptocurrency QuantumDog.
nice analysis. fully agree with you. all the crypto in the world does nothing if i can stream me your framebuffer via some gpu flaw, bring my own broken ass driver on your platform and take continual screenshots of what you so keenly decrypted for me, or sit in your baseband because the java running on your simcard was well... java (and protected with 0000/2580/1337 ♡).
there are so many roads to rome; intelligence communities, i gather, have also taken to more open roads than breaking any type of crypto (not to say they don't do that anymore). If it's more operationally useful, they will be doing it. That's a given.
Yeah, I don't think anyone told him about WindsorGreen, the RSA-cracking supercomputer that IBM built for the NSA. RSA-1024 probably should be assumed unsafe at this point.
Using the largest number factored as a benchmark for progress in quantum computing is like evaluating floor(f(0)) = 0 and floor(f(1)) = 0 and concluding f(x) = 0. You can't distinguish f(x) = 0 from f(x) = x/2 from f(x) = e^x/3 when your test is too coarse.
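The coarse-probe point made runnable; the three candidate curves are the ones named above:

    import math

    # Probing only floor(f(x)) at x = 0 and x = 1 cannot separate these curves.
    candidates = {
        "f(x) = 0":       lambda x: 0,
        "f(x) = x/2":     lambda x: x / 2,
        "f(x) = e^x / 3": lambda x: math.exp(x) / 3,
    }
    for name, f in candidates.items():
        print(name, [math.floor(f(x)) for x in (0, 1)])   # every line prints [0, 0]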
If you want to see the progress in quantum computing today, pay attention to component reliability. For example, in 2014, the best rep code run on a quantum computer had a 1% error rate per round [1]. In 2024, it was 0.00000001% and it had become possible to run full quantum codes with a 0.2% error rate per round [2]. If it takes another decade for that 0.2% to go down by a factor of a thousand, then you've got lots of time. Maybe you'll be dead before progress exceeds the coarseness of factoring. If it takes a year instead of decade (because the cost of error correction is frontloaded) then, well, pay appropriate attention to that.
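A back-of-envelope extrapolation using only the numbers in that comment; treat it as illustrative arithmetic, not a forecast:

    import math

    # Rep-code error per round: 1e-2 (2014) -> 1e-10 (2024), i.e. 8 orders of
    # magnitude in a decade. If full quantum codes (currently ~2e-3 per round)
    # improved at that same pace, a further factor of 1000 would take:
    decades_needed = math.log(1e3) / math.log(1e-2 / 1e-10)
    print(decades_needed * 10, "years")   # ~3.75 years, versus "lots of time" if it takes a decade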
I think the author was just poking fun at the idea that the 15 and 21 factoring results can be used to indicate some sort of progress. I think the graph makes that obvious.
Last I heard we were 1-2 orders of magnitude away from a physical qubit error rate low enough for error correction to have a chance of creating a threat against cryptography. So things sound pretty unlikely at the moment. We need a fundamental breakthrough.
Even if that's a few years, that's the progress until we get 1 (useful) qbit. We're then another 3 orders of magnitude from being able to factor a number that can't be factored on a classical computer. I don't think it's impossible, but I do think it's very much in the ~2 decades away stage.
Not an expert, but IIUC current quantum computers probably could "factor" numbers in the ~10k range (but doing so wouldn't be that interesting since they could only do so without the error correction that will be incredibly necessary to making Shor's algorithm work in practice)
"Are there any known cases of a real-life attacker ever using
Spectre, Rowhammer, POODLE, etc?"
This is a lot like the Y2K problem, in that nothing particularly bad happened because a lot of people spent a lot of time and money on fixing things. That doesn't mean we shouldn't take them seriously.
It's highly misleading context. Leaking tiny amounts of information a tiny fraction of the time can be totally disastrous in the context of cryptography.
One of my favourite recent examples (2020): "LadderLeak: Breaking ECDSA With Less Than One Bit Of Nonce Leakage" https://eprint.iacr.org/2020/615.pdf
But I think the issue still stands: we keep hearing about timing attacks, spec-ex attacks, partial nonce reveal, etc., and yes, you should carefully design around them, but they're pretty much overhyped and there are easier ways to attack systems. The only timing attack that's been seen in the wild is the MAC bypass in the Xbox 360. The reality is it is too difficult to launch these attacks, and the cryptography is rarely attacked unless there are straightforward flaws.
As for the partial nonce, I've seen many attacks on this kind of issue, but I've yet to see this mistake made.
There aren't many in-the-wild timing attacks because constant-time is table stakes for cryptography implementation, and has been for decades.
It's like saying that when driving on a road, the "wheels fall off" risk is overhyped. Wheels falling off may be low on your personal list of concerns, but only because car manufacturers make sure they don't design cars whose wheels fall off. A car with wheels that fall off would be unacceptable.
> There aren't many in-the-wild timing attacks because constant-time is table stakes for cryptography implementation, and has been for decades.
That is one theory and plausible, but another plausible theory is that the conditions to set up a timing attack are usually infeasible, due to requiring high-precision timing, a very large number of samples/measurements, and probes in an ideal situation. AES in software is still a good example of an algorithm where timing attacks are still possible. Take for example djb's paper on cache timing [1]; it requires 4 million samples before key recovery. The reality is that kind of attack is highly unlikely to occur, and there have been opportunities where non-timing-safe AES has been used. I'd argue the more pragmatic stance of an attacker is that timing attacks aren't really practical unless attacking a protection mechanism of a piece of hardware like the Xbox 360; at least for network services it is really low on the list.
As for the ECDSA attack, again an impressive feat, but only 773 wallets, which isn't real cause for concern. I think the bigger concern is using amateur 3rd-party implementations of Bitcoin that don't do nonce generation properly when signing; it really isn't that hard, but I agree it can be a footgun. The reality is this is still a negligible concern compared to real attacks happening every day draining crypto wallets due to the malware threat.
A computer can do something 4 million times in the blink of an eye - even over a network if you can pipeline your requests (especially if you can get a VPS instance in the same datacentre as your victim). The only reason we have to resort to fiddly attacks like cache timing is because the low-hanging fruit (e.g. data-dependent branching) has already been covered.
If you can point me to somewhere non-timing-safe AES is in active use in an adversary-interactive setting, please let me know so I can take a look!
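For a concrete picture of the "data-dependent branching" low-hanging fruit mentioned above, here is a minimal illustrative sketch (not taken from any real product) of a variable-time MAC check versus a constant-time one:

    import hmac

    def naive_mac_check(expected: bytes, provided: bytes) -> bool:
        # == can bail out at the first mismatching byte, so response time can
        # leak how many leading bytes the attacker already has right.
        return expected == provided

    def constant_time_mac_check(expected: bytes, provided: bytes) -> bool:
        # hmac.compare_digest takes time independent of where the first
        # mismatch occurs, removing that timing signal.
        return hmac.compare_digest(expected, provided)

As I understand it, the Xbox 360 MAC bypass mentioned earlier was essentially the first pattern.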
"Take, for example, the simple case of a linear congruential generator (LCG), which is the typical textbook introduction to PRNG implementations. LCGs are to PRNG what ROT13 is to encryption and “1234” is to secure passwords. Despite that, due to their simplicity and popularity, they are the default choice for many non-critically secure applications" [0]
As a hacker, always try the easiest things first. It's crazy how often they work.
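A minimal LCG sketch (glibc-style constants) showing why the quote above compares it to ROT13: the whole internal state is the previous output, so one observed value predicts the rest of the stream.

    def lcg(seed, a=1103515245, c=12345, m=2**31):
        # Classic textbook linear congruential generator.
        state = seed
        while True:
            state = (a * state + c) % m
            yield state

    gen = lcg(seed=42)
    leaked = next(gen)            # an attacker observes a single output...
    clone = lcg(seed=leaked)      # ...and can clone the generator from it
    print([next(gen) for _ in range(3)])
    print([next(clone) for _ in range(3)])   # identical predictions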
I appreciated calling out the well-known job role known as: "Fly from one exotic location to another and argue over which post-physics-experiment algorithm is the most cromulent."
Every tech company maintains a select cadre of hot-air specialists whose chief responsibility is to keep the catered lunch warm with an endless cycle of self-important discourse.
>"These are physics experiments, not computers. Claiming that it’s a computer misrepresents what we’re really working with. Every time you see “quantum computer” mentally substitute “physics experiment”, which is what’s actually being discussed."
This needs to be a disclaimer on every discussion of "quantum computing."
This seems like a rather pedantic distinction unless you go with "Computer = von Neumann architecture", in which case the category becomes rather strict (Are FPGAs/analogue computers physics experiments too?).
If we assemble a physics experiment to obtain a result that exists at a meaningful level of abstraction from the physical processes themselves (like factoring a number using quantum processes, or adding numbers using transistors), it seems appropriate to call this experiment a computer.
Were the first point-contact transistors that were developed in the 1940s at Bell Labs and that consisted of a piece of gold foil stretched over a plastic wedge which was then cut by hand at the tip and pressed against a block of germanium stacked on a metal base, a computer or an experiment?
Experiments to show that solid-state materials were capable of amplifying or switching one electric voltage/current with another.
I'm not disputing that these later became the building blocks for computers, just as the quantum gate experiments being conducted in laboratories today may become the building blocks for quantum computers in the future. But that's not where we are now.
Then I wouldn't call them computers, just like I wouldn't call experiments with single quantum gates computers. But if a physics experiment utilizes quantum processes to factorize a number (even if it's only the number 21), then I'd call that a computation, and the experimental setup a computer.
While it's possible that CRQCs (cryptographically relevant quantum computers) are vaporware that will never be made practical, now that we have some PQC algorithms we might as well hybridize them.
The only downside of the PQC algos is that they don't nicely fit inside packets. So just design newer protocols that don't rely on discrete packets; most of them do that already.
> They also describe an attack against 11-round AES-256 that requires 2^70 time—almost practical.
>> AES is the best known and most widely used block cipher. Its three versions (AES-128, AES-192, and AES-256) differ in their key sizes (128 bits, 192 bits and 256 bits) and in their number of rounds (10, 12, and 14, respectively).
>> In the case of AES-128, there is no known attack which is faster than the 2^128 complexity of exhaustive search. However, AES-192 and AES-256 were recently shown to be breakable by attacks which require 2^176 and 2^119 time, respectively.
> They also describe an attack against 11-round AES-256 that requires 2^70 time—almost practical.
But... nobody uses 11-round AES-256. And, crucially, these are related-key attacks, not practical for, say, breaking TLS.
In 2009, a new related-key attack was discovered that exploits the simplicity of AES's key schedule and has a complexity of 2^119. In December 2009 it was improved to 2^99.5... However, related-key attacks are not of concern in any properly designed cryptographic protocol, as a properly designed protocol (i.e., implementational software) will take care not to allow related keys, essentially by constraining an attacker's means of selecting keys for relatedness.
(Note that the attack with time complexity 2^99.5 also requires 2^77 bits of memory, or ~16 ZiB, which is, um, billions of terabytes of RAM? edit: actually, this is 2^77 blocks worth of memory, so add a couple more orders of magnitude.)
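A quick arithmetic check of those memory figures, interpreting 2^77 first as bits and then as 16-byte AES blocks:

    ZiB = 2 ** 70                    # bytes in a zebibyte

    as_bits   = 2 ** 77 / 8          # 2^77 bits, converted to bytes
    as_blocks = 2 ** 77 * 16         # 2^77 16-byte AES blocks, in bytes

    print(as_bits / ZiB)             # 16.0  -> the "~16 ZiB" figure
    print(as_blocks / ZiB)           # 2048.0 ZiB (2 YiB), 128x more: "a couple more orders of magnitude"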
To date, the best unconditional attack on any full variant of AES provides a factor of ~4 speedup, although it requires 9 PB of data just for AES-128.
Sure, but that inequality of meaning would have to lead to a 'Therefore I conclude this specific, highly infeasible, self-contradictory secret exists' - which is perhaps a common problem with arguments for religion.
I'm confident there are fairly mundane multi-generational secrets, without having to summon the Illuminati or Knights Templar. Either way, it doesn't negate the interest in having a technology that could provide that.
Cryptography isn't a technology for keeping secrets; it's a technology for keeping secrets in transit. It's not particularly useful for keeping multigenerational secrets (how do you do key management over 100 years?)
> Is your suggestion that key rotation is a necessary requirement?
If you want your secret to last more than one human lifetime, you have to enroll new people into the system somehow.
My main argument would be that cryptography is mostly useless in such a scenario. It makes much more sense to put the secret in a filing cabinet, put a lock on the filing cabinet, and if you are really paranoid, maybe hire some people with guns to guard it. Cryptography for such a scenario is the sort of thing that happens in movies not real life.
And even if cryptography were used, it doesn't seem like public-key would be very applicable at all, so PQC is extra irrelevant.
Diplomatic communications about how you plan / succeed at undermining allies. Or communications about atrocities you knew were happening, but decided to ignore.
There is plenty of reason to want to keep diplomatic and military communications secret for a long time.
> Diplomatic communications about how you plan / succeed at undermining allies. Or communications about atrocities you knew were happening, but decided to ignore.
> There is plenty of reason to want to keep diplomatic and military communications secret for a long time.
I don't think that makes sense. Why would you want to keep implicating communications around for 100 years? Wouldn't you just destroy them?
Cryptography isn't useful for secrets you want nobody to know. It's useful for secrets you want some people to know but not others.
That said, it also seems questionable how much people care about atrocities a hundred years after the fact. For example, nobody is boycotting IBM today for their role in the Holocaust.
News of these things does come out from time to time, usually over a shorter time period, and these create embarrassment, shock, pain and anger, but has any had significant substantive consequences? Here is a hypothetical one to consider: FDR secretly informed Hitler that the US would support an invasion of the USSR - how far would the consequences of such a revelation reach if it were revealed today?
It's not so much about the impact of the secrets leaking. Instead, it's about the impact on communications if diplomats need to worry about their communications leaking.
Because that was the number the person I was responding to gave.
In any case, North Korea has the bomb. I think the secret is out. The most difficult thing at this point is the engineering challenge, not the book knowledge.
I was under the impression that information about how to build nukes was mostly well known by most countries, and it is just a matter of getting enough of the right type of uranium or whatever.
My genetic data will be relevant even after I'm dead because my children and grandchildren share it with me. And it's a modern kind of data that didn't exist in 1925.
I hate to tell you, but even if you have never done 23 and me or anything similar, enough of your family has that your genetic data is already very readily accessible to the parties who need it.
Realistically you can't keep that secret, though. There are a lot of people who share enough of your DNA to reconstruct parts of it. Possibly hundreds. And all it takes is a hair follicle or spit.
You are never keeping that secret against an interested adversary.
I think it is worthwhile to explore cryptography that is not based on the discrete logarithm problem. Currently, we're keeping all eggs in one conjectured basket. Even if quantum computing will never be viable, there is a non-zero chance that the discrete logarithm problem will be solved in some other way.
There is a non-zero chance of a black hole forming in the LHC and swallowing the Earth, yet we don’t have a backup plan for that. There are an infinite more examples like that. A non-zero chance is not an argument for anything, unless you have a good basis on actually quantifying it.
I get what they are saying: There is a difference between theoretical and applied.
I think the OWASP/NIST/InfoSec has always been a bit behind because of this mentality. I think there is a progressive forward looking mindset that is often seen as "mad" or "unhinged" when it's ultimately throwing paint at a wall to see what sticks.
The driver is curiosity but then someone comes along and applies CBA and ROI, and CAC...the person who was curious has left because that wasn't the goal. Eventually something will stick that meets all of those mainstream ideas.
If you think of the body as a computer, it communicates through DNA, a much larger scale of information passing. Binary is just arbitrarily selected because it was there. Should we stop exploring binary computational systems? No but we also don't need all our eggs in one basket.
This smells a lot like wishcasting masquerading as critique. I just attended a PQC conference, so I'm biased, but this paper's author makes a lot of very strong claims about the infeasibility of future attacks developing that most experts in the field would disagree with. There is a hope that this is all a fire drill and RSA/EC will survive the next decade unscathed, but there's also plenty of evidence to suggest that incremental improvements in quantum compute will eventually reach their goal. Rather than a big cannon, I see it looking more like AI/LLMs: lots and lots of small incremental improvements by researchers were needed to eventually yield some significant advancements. I pray cryptographically relevant quantum computing stays in the realm of cold fusion, but I'm not about to believe it.
Cryptanalysis has already made a few strides on breaking RSA more quickly, and I've heard from noted cryptanalysts the claim there's a significant chance RSA will be further broken in the next decade regardless of PQ. It's a scary take to double down on RSA of all things.
Antoine Joux was on the side of classical cryptanalysis on a 2014 bet. This was right after the small-characteristic discrete log advances, so that might no longer be the bet if it was made today.
I don't recall their name; they were a guest on the Root Causes podcast discussing PQ topics, though your summary varies from what I was trying to express. It's not that RSA will be classically broken, but that novel attacks to reduce factoring times of RSA keys, like batch attacks, have a statistically significant chance of being discovered; that "further" was not meant to imply "completely broken classically," but "weakened further using classical approaches". Sounded plausible to me, though that's not a thing I'm any kind of domain expert in.
Sorry, I'm busy; I might have time to look the podcast up in a day or two, but I don't think there's any actual value in that to anyone over an offhand comment, so forgive me if I find other things to do instead.
one thing I agree with: software/standards people like to churn. churn is a source of new income regardless of whether QC could break RSA in the future; people can make a profit now
the main takeaway is that complexity is the enemy of security; we can all agree with this, no? ... has anyone ever opened the thousands-of-pages bluetooth standard?
this is a really cool presentation to read regardless of whether people agree with it or not. regarding the QC stuff ... i have no idea.
The NSA may not be breaking RSA right now but I'd bet good money they are storing some traffic for later. So the whole factorization isn't worth it argument feels silly.
Man, some real "Cynicism is the intellectual cripple's substitute for intelligence" energy here. Seems unnecessary given what I read of Gutmann's history.
I get it must be annoying to be someone working in cryptography and always be hearing about QC when there are endless security issues today. It must be tiring to have all these breathless pop-science articles about the quantum future, startups claiming ridiculous timelines to raise money on hype, and business seminars where consultants claim you'll need to be prepared for the quantum revolution changing how business works. I feel the same way.
But you shouldn't let that drive you so far in the opposite direction that you're extrapolating fun small quantum factoring experiments from a decade ago to factoring 1024 bit keys in the year 4000. Or say things like 'This makes the highly optimistic assumption that quantum physics experiments scale linearly... the evidence we have, shown by the lack of progress so far, is that this is not the case'. If we get fault tolerant QC of course it scales linearly and it seems embarrassing as a computer scientist to not understand the difference between constant and asymptote. "Actually, quantum computers are so new and untested that they really qualify as physics experiments"... yeah? And?
None of this is to say that fault-tolerant highly scalable QC implementing Shor's algorithm is just around the corner, I truly believe it's not. But the world of QC is making really interesting advances running some of the coolest experiments around and I find this superior Hossenfelder-like cynicism in the face of real science making real progress so so tiring.
It's strange to see so many negative responses that start with vague emotional language. It's almost as if a lot of critics didn't read the presentation. Or maybe they think the rest of us didn't read it.
I read the whole presentation. The physics-experiment criticism Gutmann makes that I referred to is at page 16/30. Nothing after that engages with QC to the extent that the first half of the presentation did, so I didn't refer to later parts.
Good question. Yes it is strange, because the information on the slides is mainly numerical. For example the integers "15" and "21" and the years "2002" and "2012" don't pack much of an emotional charge for me. I suspect they wouldn't for most people.
One major point of the presentation here is that it's not making real progress. People are still publishing papers, but they have done nothing with an effect outside their little community. It's been in roughly the same state for the last 10 years. For a minimum of 30 years, there have been promises of amazing things coming in the next decade in QC. After how many decades should those predictions lose credibility?
There is real opportunity cost to doing this stuff, and real money getting sucked up by these grifters that could be spent on real problems. There are real PhD students getting sucked down this rabbit hole instead of doing something that actually does make progress. There is a real cost to screwing around and making promises of "next decade."
> One major point of the presentation here is that it's not making real progress.
How are you measuring "real progress"?
> People are still publishing papers, but they have done nothing with an effect outside their little community.
Having an effect outside the research community is essentially a Heaviside function. Before the field is mature enough, there is no effect outside, but once the field is mature enough, there is an effect outside. Makes it hard to judge if there is any progress or not.
The field has had 40 years of maturing. Experimentation on QC started in the 1980's. At what point are we going to be factoring numbers or (more realistically) simulating chemical interactions?
Real progress in this field is very easy to measure: it's based on the number of effective qubits of computation. That is just a metric where QC is failing to deliver so badly that everyone in the field wants to deny its existence.
Unfortunately, the level of investment in QC is very much outsized compared to the level of progress. These things should rise at the same time. More promising areas of science can get the investment that is otherwise being sucked into QC.
> Having an effect outside the research community is essentially a Heaviside function.
This is something that people like to say but is never true. Impact on the outside world for new technologies is almost always a sigmoid function, not a Heaviside function. You should see some residual beneficial effects at the leading edge if you have something real.
I agree! People who predicted QC soon over the last few decades should lose credibility. They were wrong and they were wrong for no good reason. There is a real opportunity cost to focusing on the wrong thing. There are definitely grifters in the space. Responsible QC researchers should call it out (e.g. Scott Aaronson).
But it doesn't necessarily follow that you can dismiss the actual underlying field. Within the last five years alone we've gone from the quantum supremacy experiment to multiple groups using multiple technologies to claim QEC code implementations with improved error rates over the underlying qubits. People don't have to be interested in these results, they are rather niche (a little community as you put it), but you shouldn't be uninterested and then write a presentation titled 'Why Quantum Cryptanalysis is Bollocks'.
Well, when the little community circles the wagons around the grifters instead of excising them, the rest of us get to ask questions about that community. The cold fusion community did the same thing for several decades, too.
And by the way, about 0.01% of the grifters in the QC space are getting called out right now.
Not working for years upon years, and suddenly it works, and if it does it can be a huge problem. I don't know if I trust PQC though, but that only means that more research on it is needed.
- Crypto building blocks are important basic research because it underpins everything.
- Good crypto (this exists btw) is impossible to beat, unless QC is available. That's why PQC is being researched. Think about what kind of crypto NSA wants to break, it's not your bank of america passwords.
- IDK why this guy thinks we need to shut down Los Alamos to do crypto, does he not think the NSA has datacenters of its own?
- The problem with "well, it's not a problem now, why are we preparing for it" is that nation states are storing everything that is going on in the internet, waiting for when QC becomes active. This essentially means you can assume every secret you have will not be secret in 10/20/50 years. Your password is probably fine, but if you sent secret diplomatic cable today, it might be unlocked for your adversaries some years later. These secret nation-state comms are designed to be unlocked after N years normally since keeping them secret forever is expensive; PQC is simply designed to withhold that number N.
- The NSA is generally known to be several decades ahead of the academia. They infamously knew and corrected a differential cryptography vulnerability in DES long before differential cryptanalysis was known in the public community. Saying QC isn't growing fast enough doesn't mean much.
- The 2008 financial crisis metaphor is the only one that seemed poignant
Love me some Peter Gutmann. This is classic Gutmann.
A lot of this presentation is mooted by understanding PQC as an scientific question rather than an engineering one. What are the precise natures of quantum-superior attacks on cryptosystems and what are key establishments and signatures that resist those attacks? Whatever else you think of quantum cryptanalysis those are undeniably important theoretical questions.
A few more slides are mooted by the likelihood that any mainstream deployed PQC system is going to be hybridized with a classical cryptosystem.
As an articulation of a threat model for modern computing, it simultaneously makes some sense and proves too much: if you think OWASP-type vulnerabilities are where everyone's head should be at (and I sort of agree), then all of cryptography is a sideshow. I'm a connoisseur of cryptographic vulnerabilities that break real systems the way SQL injection does (a bitflipping attack on an encrypted cookie, a broken load-bearing signature scheme) but even I have to admit there's 1 of those for every 10,000 conventional non-cryptographic attack.
But of course, it also depends on who your adversary is. Ironically, if you're worried about state-level SIGINT, the barrier for OWASP-style attacks may be higher than that of large-scale codebreaking; passive interception and store-now-decrypt-later is the SIGINT love language.
My biggest thing with all of this is a core belief about organizations like NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost.
(Also, his RSA-1024 analysis is off; it's missing batch attacks).
Reading Gutmann I see a disconnect from reality. Yes a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. These are interesting research topics and interesting discussions. But it doesn't matter. Using that as arguments against PQC is moot, no longer relevant.
The decision to move the world to PQC has already been taken, and we are in full transition mode. Orgs like NATO has transitioned. Many have published time plans to phase out classical algos (NSA, GHQC, Australia) or will shortly (EU). Orgs like ETSI will transition to PQC in upcoming revisions (5G) and next gen systems (6G). Just to name a few.
In general, looking at agencies, nations pushing for PQC algorithms and transition time plans seems to be curiously well aligned. All using the NIST PQC algos, no real mentioning of hybrid solutions. Australias plan to ban the use of SHA-256 is one that sticks out a bit from the others.
Regarding time plans, I find the suddenness, almost haste to be quite interesting. I've been searching quite extensively to gather info for PQC talks, and the motivation for a stated time plan are rarely well motivated. Much hand waving. And what drove the rush? I'm suspecting a bit of a crowd panic. But there is a clear change around 2022, 2023. Suddenly the PQ threat became very important. And within 10 years (or less) the world will be using PQC basically everywhere PKI is used besides legacy systems where lifetimes and update issues makes the change unrealistic.
And are we really sure the schemes used will be hybrid? If DJB is right, NIST is pushing for PQC only. The NSA recommendations that was released last year AFAIK does not state anything about hybrid solutions. Details are less clear what NATO has transitioned to but seems to be PQC only. And the ETSI push for PQC in 5G, 6G does not seem to aim for hybrid solutions. I would love to be told to be wrong about the use of hybrid schemes.
The iMessage protocol, Signal and things happening in IETF goes for hybrid schemes, which I think is the right way (if that count for anything). So it seems we have the open world and private sector versus governments, orgs fairly close to governments when it comes to hybrid vs PQC only.
But maybe this is actually the reason behind the presentation: he sees this vast effort to transition to PQC, and he's wandering whether it is worth it or not. And for what is worth, I'm also a bit skeptic of the maturity of PQ cryptosystem, as they tend to be much more complicated than the classical counterpart and didn't receive as much scrutiny. What happened to Rainbow (layered UOV) is a cautionary tale.
Yes, but that is why I say that his argument is moot. It is probably not worth going to PQC, and it will probably expose us to problems. But it is too late to try and get people to understand, agree on this.
The world has decided that the percieved risk makes it worth going to PQC and we can't change that fact no matter the nice pictures of Schwerer Gustav. We are in full transition mode and will not go back. Even if CRQC never materialize.
We probably will experience a number of issues, problems that we wouldn't have by not going to PQC. Driving costs, making things vulnerable. More work for us I guess.
> I find the suddenness, almost haste to be quite interesting. > But there is a clear change around 2022, 2023.
I think that's probably because the NIST competition [1] to choose their standard algorithms really started to heat up then.
NIST has a very large gravity well in the academic and industrial cryptographic community, so as soon as it became clear which algorithms NIST would pick (they chose Kyber / ML-KEM and Dilithium / ML-DSA), the (cryptographic) world felt it could start transitioning with much more certainty and haste.
1. https://csrc.nist.gov/projects/post-quantum-cryptography/pos...
Yes, that is one aspect, and when the drafts was published you could see orgs started running (I've got a nice timeline in my slides). But I still find the haste interesting. There is very little time for the transitions compared to the adoption rate of other crypto standards. The NIST algos are imho still quite immature, which is one big motivation for hybrid schemes.
A bit off topic, as a European, what is happening with DOGE, slashing funding for CISA, TAA etc, I'm seriously worried about NIST. As you say, NIST is very important in many areas. For USA, with things like the coordintated universal time normal. But also for federal cybersec standards that have led to interop with the rest of the world cryptographically. Will NIST be slashed, and if so will the crypto department be spared? If not, what would remain? New standards, the validation program? Will Falcon become a standard, or for that matter the new lightweight symmetric algo based on Ascon? (For which I'm eagerly waiting for NIST to publish test vectors so that I'm able verify that my implementation is compliant.)
I think the haste is probably down to a risk calculation. If practical quantum breaks of classical crypto don't materialise in the next 5-10 years, "all" that's happened is we've cycled onto a new cypher suite sooner than we otherwise would have.
The reverse picture, where they do and we haven't, is so colossally damaging that it doesn't matter if the probability of quantum breaks landing is actually quite small. In expected value terms we still come out ahead.
You don't need to assume that someone in an NSA lab has already demonstrated it for this to work out, and you don't need to assume that there is ever a practical quantum computer deployed for this stuff. All you need is for the probability to be above some small threshold (1%? 5%? I could believe something in that range) to make running for the exits the right move today.
How does the calculation look like if the thing we migrate to ends up being broken way easier than classical algorithms?
Because the current plans aren't to migrate to just hybrid classical+PQC schemes, the plans are to migrate to PQC fully. Discarding both RSA and ECC.
> Because the current plans aren't to migrate to just hybrid classical+PQC schemes, the plans are to migrate to PQC fully. Discarding both RSA and ECC.
This isn't true. NIST has been saying that, but everyone else just laughs and implements hybrid since throwing out RSA/ECC is so obviously stupid.
If you have references to nations, governments that state that transition to hybrid I would love to get references. The EU transition will not be hybrid. The NSA plan is not hybrid. ETSI is not hybrid.
My view is that IETF and commercial entities such as Apple, Google and open source world are the ones going hybrid. In this case I would love to be wrong.
> NIST has been saying that, but everyone else just laughs and implements hybrid since throwing out RSA/ECC is so obviously stupid.
The Australian government is also saying this.
That is a very relevant point. Add a bit of scare mongering, herd mentality and downplaying of the technical effects, risks, you get the ones setting policies taking a decision to transition - just like everybody else.
When I have seen time estimates, everyone is referring to Mosca's Theorem. This is the idea that "store now, decrypt later", combined with the estimated time until a working quantum cryptanalysis is feasible, and a finite transition time for existing crypto standards and technologies (think update times for long-living tokens like ID cards with certificates) makes the available delay until a change must start quite short.
Some additional facts that may aid the discussion:
- Dilithium + Kyber is faster than ECDSA + ECDH on the same hardware. Depending on the platform, it can be up to 33% faster.
- Most commercial entities are implementing hybrid, but in concatenation, not layered, mode. The classical inclusion is mainly for compatibility with "legacy" systems.
As long as the speed difference isn't orders of magnitude, and you are doing many, many session inits, I don't see this to be a real argument. For embedded systems, the difference is indeed at least 10x. As we observed running Dilithium on RV32I on the Tillitis Tkey (https://tillitis.se/). EdDSA takes about a second, which is ok for a single signature. ML-DSA on the same platform is ~20 seconds. But yes, if you are a web server and don't scale automatically with number of sessions, the better performance is good.
The negative thing for all systems with the current NIST PQC algoritms are the longer keys, which gets even worse when going hybrid. See the experiments by Cloudflare for example with PQC in TLS, and adding PQC keys in certificates.
I agree that for certificates, there is a possibility of being compatible with legacy systems by adding the ML-DSA signature as an extension (and the legacy system being able to handle the larger certs). But please show references for the main reason for hybrid is compatibility. IF we look at the motivation for the TLS 1.3 hybrid scheme it states:
"The primary goal of a hybrid key exchange mechanism is to facilitate the establishment of a shared secret which remains secure as long as as one of the component key exchange mechanisms remains unbroken."
(https://www.ietf.org/archive/id/draft-ietf-tls-hybrid-design...)
That page states that backwards compatibility maybe is one of several possible additional goals. But it is not the main goal.
The problem PQC algorithms must solve is not only to be resistant to attacks by future quantum computers but also against attacks on classical computers. And the hardness in terms of security as the number of bits in the key must scale about as fast as classical algorithms. And work as about well a classical algorithms on classical computers.
All ciphers have warts. The ones we use are the ones we think are secure, but also have warts we can live with. RSA scales slowly with number of bits (and have other warts). EC scales faster and becomes faster than RSA for the same strength, but has other warts. McEliece seems like a good, conservative PQC algorithm, but those keys... TDEA was deemed to provide a too low security margin, but it also was to slow (48 rounds) and with a too small block size.
"Reading Gutmann I see a disconnect from reality. Yes a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. [...] But it doesn't matter."
I hear you saying this: Gutmann's factual points may be correct. But he is discussing quantum computing. Quantum computing is so irrelevant to post-quantum cryptography that even mentioning it in that context makes Gutmann seem disconnected from reality.
I don't necessarily disagree. I'm just trying to make sure I understand.
Good question. I may have not explained very well. Let me give it a try.
1. I read the first part, with Gustav Schwerer etc as an argument against CRQC to ever become possible, so moving to PQC is not needed.
2. I read the second part, about the hardness of attacking classical algorithms, for example the DES cracker as a motivation against attacks on crypto being an actual threat vector. Which ties in to point 1 as a CRQC would be very expensive and hard to use in practice. It is not a computer but a physics experiment.
3. He then talks about real threats and how they don't change very much. Points to OWASP top ten etc.
I totally agree with him on point 1 and 2. I'm just as skeptical. But what I'm saying is that it is too late. We are quite possibly switching to algorithms that will never add any security. Spending huge resources and pushing unneeded changes to systems around the world. Telling the world that it is unnecessary will not change that fact.
Coming to point 3. I don't agree with him. Yes, OWASP top 10 shows the same more or less trivial attack are the ones being used. Nobody use a zero day unless it is needed. If I can become sysadmin through a reused password or a misconfiguration of Teams, why use something more advanced? But I see him using this as an argument against ciphers being broken a real threat vector, and that is a different attack.
The first one is an active attack against a system. It may be a nation state actor that wants to infiltrate, get a persistent access, exfiltrate and possibly destroy the system. It may be a ransomware organization. They will do the same thing, but their timeline is much shorter. (and of course the difference between a nation state and organized crime can be very blurry).
But recording Internet traffic, and over long time (days, months, years, decades) try to decrypt it are solely of interest to a nation state. And for that attack and end game, what OWASP top ten looks like is totally irrelevant. It is not an active attack against a system. It is done in secrecy by entities with a lot of patience by entities that have huge resources. For them Quantum Computers are very interesting and relevant to discuss. But not in public.
I guess that was a waay to long answer. But I hope it explained what I ment.
> NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost
I'm tempted to draw an analogy to the space race of the '60s and the current AI panic.
I'm with you, but the government (at least in the US and UK) should definitely be spending more time figuring out how to patch reliably, and a little less on PQC.
Perhaps the other side of the coin Peter calls "churn" is "updated software". Government, like every other old organization, has a ton of legacy systems. Even if CRQC is a pipe dream, having an excuse to tell everyone across the board they need to update their stuff sometime in the next decade might be a net win.
> if you're worried about state-level SIGINT
... you might be worried about other things like postal interception, or physical security of your devices. You can never order any electronics or anything online, or use courier services, if you are a person of interest for state-level spooks like the FBI or CIA. Tech spooks making in-person visits to your home, phone, or computers is also possible.
State SIGINT is generally a lot easier than many of those actions - especially if the state you're primarily concerned about isn't the one you live in.
It's not as hard as you may think.
For many foreign states like the US and China, it's easier for them to intercept and alter packages going through borders, both legally and logistically.
It has already been documented how the CIA and FBI have a permanent presence at the main courier airport hubs. It's easy for Amazon, FedEx, or USPS to divert a package or letter through a conveyor belt that goes through government areas.
Same for the Chinese, of course. Anything ordered from China with your name or address can't be trusted if you believe you might be a sufficiently important person of interest for them.
"Love me some Peter Gutmann. This is classic Gutmann."
Indeed, he doesn't hold back and is quite funny in his frankness.
Though when it comes to his book draft, he really needs an editor: https://www.cs.auckland.ac.nz/~pgut001/pubs/book.pdf
All this "mooting" is just not happening as described above. Anyone can look at the presentation and see that it's not irrelevant.
I've heard his name, but after reading this, I'm calling him a national treasure now.
Right after I launch my new cryptocurrency QuantumDog.
nice analysis. fully agree with you. all the crypto in the world does nothing if i can stream me your framebuffer via some gpu flaw, bring my own broken ass driver on your platform and take continual screenshots of what you so keenly decrypted for me, or sit in your baseband because the java running on your simcard was well... java (and protected with 0000/2580/1337 ♡).
there are so many roads to rome; the intelligence communities, i gather, have also taken to more open roads than breaking any type of crypto (not to say they dont do that anymore). If it's more operationally useful, they will be doing it. That's a given.
Yeah, I don't think anyone told him about WindsorGreen, the RSA-cracking supercomputer that IBM built for the NSA. RSA-1024 probably should be assumed unsafe at this point.
Using the largest number factored as a benchmark for progress in quantum computing is like evaluating floor(f(0)) = 0 and floor(f(1)) = 0 and concluding f(x) = 0. You can't distinguish f(x) = 0 from f(x) = x/2 from f(x) = e^x/3 when your test is too coarse.
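A quick toy illustration of that coarseness (a sketch in Python; the three curves are made up purely for illustration):

    import math

    # Three hypothetical progress curves that a too-coarse benchmark cannot tell apart.
    curves = {
        "flat":        lambda x: 0.0,
        "linear":      lambda x: x / 2,
        "exponential": lambda x: math.exp(x) / 3,
    }

    # Evaluate floor(f(x)) at x = 0 and x = 1, as in the analogy above.
    for name, f in curves.items():
        print(name, [math.floor(f(x)) for x in (0, 1)])   # every curve prints [0, 0]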
If you want to see the progress in quantum computing today, pay attention to component reliability. For example, in 2014, the best rep code run on a quantum computer had a 1% error rate per round [1]. In 2024, it was 0.00000001%, and it had become possible to run full quantum codes with a 0.2% error rate per round [2]. If it takes another decade for that 0.2% to go down by a factor of a thousand, then you've got lots of time. Maybe you'll be dead before progress exceeds the coarseness of factoring. If it takes a year instead of a decade (because the cost of error correction is frontloaded) then, well, pay appropriate attention to that.
[1]: https://arxiv.org/abs/1411.7403
[2]: https://arxiv.org/abs/2408.13687
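To make the decade-vs-year framing concrete, here is a rough back-of-the-envelope sketch (my own assumptions: the 0.2% figure quoted above as the starting point, a purely hypothetical target three orders of magnitude lower, and a constant yearly improvement factor):

    import math

    rate_2024 = 2e-3           # 0.2% logical error rate per round, per the comment above
    target = rate_2024 / 1000  # hypothetical threshold, three orders of magnitude lower

    # Years to reach the target if the error rate improves by a constant factor each year.
    def years_to_target(yearly_factor, start=rate_2024, goal=target):
        return math.log(start / goal) / math.log(yearly_factor)

    for factor in (2, 5, 10):  # assumed yearly improvement factors, for illustration only
        print(f"improve {factor}x/year -> {years_to_target(factor):.1f} years")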
I think the author was just poking fun at the idea that the 15 and 21 factoring results can be used to indicate some sort of progress. I think the graph makes that obvious.
Last I heard we were 1-2 orders of magnitude away from a physical qubit error rate low enough for error correction to have a chance of creating a threat against cryptography. So things sound pretty unlikely at the moment. We need a fundamental breakthrough.
Is 1–2 orders of magnitude a lot or a little here? With exponential progress that could be a few years; with stalled progress that’s maybe never.
Even if that's a few years, that's the progress until we get 1 (useful) qubit. We're then another 3 orders of magnitude from being able to factor a number that can't be factored on a classical computer. I don't think it's impossible, but I do think it's very much in the ~2 decades away stage.
Not into the field, but how far are we from factoring 35?
Not an expert, but IIUC current quantum computers probably could "factor" numbers in the ~10k range (but doing so wouldn't be that interesting, since they could only do so without the error correction that will be absolutely necessary to make Shor's algorithm work in practice).
Maybe Shor’s algorithm is just a dead end.
"Are there any known cases of a real-life attacker ever using Spectre, Rowhammer, POODLE, etc?"
This is a lot like the Y2K problem, in that nothing particularly bad happened because a lot of people spent a lot of time and money on fixing things. That doesn't mean we shouldn't take them seriously.
You should include the context:
Fancy crypto attack: You have a 0.00001% chance of recovering 2 bits of plaintext from a single message
Any of the OWASP top ten: You have a 100% chance of recovering the plaintext of all the messages
It's highly misleading context. Leaking tiny amounts of information a tiny fraction of the time can be totally disastrous in the context of cryptography.
One of my favourite recent examples (2020): "LadderLeak: Breaking ECDSA With Less Than One Bit Of Nonce Leakage" https://eprint.iacr.org/2020/615.pdf
But I think the issue still stands: we keep hearing about timing attacks, spec-ex attacks, partial nonce reveals, etc., and yes, you should carefully design around them, but they're pretty much overhyped and there are easier ways to attack systems. The only timing attack that's been seen in the wild is the MAC bypass on the Xbox 360. The reality is that these attacks are too difficult to launch, and the cryptography is rarely attacked unless there are straightforward flaws.
As for the partial nonce, I've seen many attacks on this kind of issue, but I've yet to see this mistake made.
There aren't many in-the-wild timing attacks because constant-time is table stakes for cryptography implementation, and has been for decades.
It's like saying that when driving on a road, the "wheels fall off" risk is overhyped. Wheels falling off may be low on your personal list of concerns, but only because car manufacturers make sure they don't design cars whose wheels fall off. A car with wheels that fall off would be unacceptable.
Here's an in-the-wild example of weak nonces being used to compromise Bitcoin wallets: https://research.kudelskisecurity.com/2023/03/06/polynonce-a...
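For anyone wondering why weak nonces are so catastrophic, here is a minimal sketch of the simplest failure mode (a literally repeated nonce), not the LCG-derived nonces in the linked Polynonce write-up; the signature values below are made up just for the self-check:

    # If two ECDSA signatures (r, s1) and (r, s2) over message hashes m1, m2
    # share the same nonce k, the private key d falls out with basic algebra:
    #   s = (m + r*d) / k  (mod n)
    n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # P-256 group order

    def recover_key(m1, s1, m2, s2, r):
        k = (m1 - m2) * pow(s1 - s2, -1, n) % n   # recover the shared nonce
        return (s1 * k - m1) * pow(r, -1, n) % n  # recover the private key

    # Self-check with made-up values (r would normally be derived from k*G).
    d, k, r = 0x1234, 0x5678, 0x9ABC
    m1, m2 = 111, 222
    s1 = (m1 + r * d) * pow(k, -1, n) % n
    s2 = (m2 + r * d) * pow(k, -1, n) % n
    assert recover_key(m1, s1, m2, s2, r) == d

As I understand it, the Polynonce attack generalizes this algebra to nonces related by a low-degree polynomial recurrence rather than literally repeated.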
> There aren't many in-the-wild timing attacks because constant-time is table stakes for cryptography implementation, and has been for decades.
That is one theory and plausible, but another plausible theory is that the conditions to set up a timing attack are usually infeasible, requiring high-precision timing, a very large number of samples/measurements, and probes in an ideal situation. AES in software is still a good example of an algorithm where timing attacks are still possible. Take, for example, djb's paper on cache timing [1]: it requires 4 million samples before key recovery. The reality is that kind of attack is highly unlikely to occur, and there have been opportunities where non-timing-safe AES has been used. I'd argue the more pragmatic stance of an attacker is that timing attacks aren't really practical unless attacking a protection mechanism of a piece of hardware like the Xbox 360; at least for network services they are really low on the list.
As for the ECDSA attack, again an impressive feat, but only 773 wallets, which isn't a real cause for concern. I think the bigger concern is using amateur third-party implementations of Bitcoin that don't do nonce generation properly when signing; it really isn't that hard, but I agree it can be a footgun. The reality is this is still a negligible concern compared to the real attacks happening every day that drain crypto wallets via malware.
[1]: https://cr.yp.to/antiforgery/cachetiming-20050414.pdf
A computer can do something 4 million times in the blink of an eye - even over a network if you can pipeline your requests (especially if you can get a VPS instance in the same datacentre as your victim). The only reason we have to resort to fiddly attacks like cache timing is because the low-hanging fruit (e.g. data-dependent branching) has already been covered.
If you can point me to somewhere non-timing-safe AES is in active use in an adversary-interactive setting, please let me know so I can take a look!
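For readers who haven't seen it, the "data-dependent branching" low-hanging fruit looks roughly like this (an illustrative sketch, not a claim about any particular library):

    import hmac

    # Early-exit comparison: runtime depends on how many leading bytes match,
    # which is exactly the data-dependent branching an attacker can measure.
    def leaky_equal(a: bytes, b: bytes) -> bool:
        if len(a) != len(b):
            return False
        for x, y in zip(a, b):
            if x != y:
                return False   # returns sooner the earlier a mismatch appears
        return True

    # Constant-time comparison: time depends only on length, not on contents.
    def constant_time_equal(a: bytes, b: bytes) -> bool:
        return hmac.compare_digest(a, b)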
"Take, for example, the simple case of a linear congruential generator (LCG), which is the typical textbook introduction to PRNG implementations. LCGs are to PRNG what ROT13 is to encryption and “1234” is to secure passwords. Despite that, due to their simplicity and popularity, they are the default choice for many non-critically secure applications" [0]
As a hacker, always try the easiest things first. It's crazy how often they work.
[0] https://research.kudelskisecurity.com/2023/03/06/polynonce-a...
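To make the ROT13 comparison concrete, here is a toy LCG (glibc-style constants chosen only for illustration, unrelated to the linked attack) showing why one observed output predicts the entire stream:

    # Toy LCG: the state update is x -> (A*x + C) mod M, and the output IS the state.
    M, A, C = 2**31, 1103515245, 12345

    def lcg(seed):
        while True:
            seed = (A * seed + C) % M
            yield seed

    g = lcg(42)
    observed = next(g)                   # attacker sees a single output...
    predicted = (A * observed + C) % M   # ...and predicts the next one exactly
    assert predicted == next(g)          # no secret entropy remains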
Or TSA airport security
I appreciated calling out the well-known job role known as: "Fly from one exotic location to another and argue over which post-physics-experiment algorithm is the most cromulent."
Every tech company maintains a select cadre of hot-air specialists whose chief responsibility is to keep the catered lunch warm with an endless cycle of self-important discourse.
A vital role, truly indispensable.
>"These are physics experiments, not computers. Claiming that it’s a computer misrepresents what we’re really working with. Every time you see “quantum computer” mentally substitute “physics experiment”, which is what’s actually being discussed."
This needs to be a disclaimer on every discussion of "quantum computing."
This seems like a rather pedantic distinction unless you go with "computer = von Neumann architecture", in which case the category becomes rather strict (are FPGAs/analogue computers physics experiments too?). If we assemble a physics experiment to obtain a result that exists at a meaningful level of abstraction from the physical processes themselves (like factoring a number into primes using quantum processes, or adding numbers using transistors), it seems appropriate to call this experiment a computer.
Were the first point-contact transistors, developed in the 1940s at Bell Labs - a piece of gold foil stretched over a plastic wedge, cut by hand at the tip, and pressed against a block of germanium stacked on a metal base - a computer or an experiment?
What were they used for?
Experiments to show that solid-state materials were capable of amplifying or switching one electric voltage/current with another.
I'm not disputing that these later became the building blocks for computers, just as the quantum gate experiments being conducted in laboratories today may become the building blocks for quantum computers in the future. But that's not where we are now.
Then I wouldn't call them computers, just like I wouldn't call experiments with single quantum gates computers. But if a physics experiment utilizes quantum processes to factorize a number (even if it's only the number 21), then I'd call that a computation, and the experimental setup a computer.
Also, I like how in those experiments they are trying to fight wave function collapse without having any physical model of wave function collapse.
While it's possible that CRQCs (cryptographically relevant quantum computers) are vaporware that will never be made practical, now that we have some PQC algorithms we might as well hybridize them.
The only downside of the PQC algos is that they don't nicely fit inside packets. So just design newer protocols that don't rely on discrete packets; most of them do that already.
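For a sense of scale, a rough sketch of the size problem (PQC sizes as I recall them from the FIPS 203/204 parameter sets, classical sizes from X25519/Ed25519, and 1500 bytes as the usual Ethernet MTU, ignoring protocol overhead):

    # Rough size comparison in bytes; treat the numbers as approximate.
    MTU = 1500  # typical Ethernet payload budget per packet

    sizes = {
        "X25519 public key":     32,
        "Ed25519 signature":     64,
        "ML-KEM-768 public key": 1184,
        "ML-KEM-768 ciphertext": 1088,
        "ML-DSA-65 signature":   3309,
    }

    for name, size in sizes.items():
        note = "fits in one packet" if size <= MTU else "needs fragmentation"
        print(f"{name:24s} {size:5d} B  ({note})")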
Page 11 (of 30) makes a terrible case with handwaving.
It ignores the requirement that secret data needs to stay secret for 30 years, or 100 years, or long into the future, and attacks only get better.
https://www.schneier.com/blog/archives/2009/07/another_new_a...
> They also describe an attack against 11-round AES-256 that requires 2^70 time—almost practical.
>> AES is the best known and most widely used block cipher. Its three versions (AES-128, AES-192, and AES-256) differ in their key sizes (128 bits, 192 bits and 256 bits) and in their number of rounds (10, 12, and 14, respectively).
>> In the case of AES-128, there is no known attack which is faster than the 2^128 complexity of exhaustive search. However, AES-192 and AES-256 were recently shown to be breakable by attacks which require 2^176 and 2^119 time, respectively.
(Note that the attack with time complexity 2^99.5 also requires 2^77 bits of memory, or ~16 ZiB, which is, um, billions of terabytes of RAM? edit: actually, this is 2^77 blocks' worth of memory, so add a couple more orders of magnitude.)
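Checking that arithmetic (a quick sketch; the only assumption is the 16-byte AES block size):

    # 2^77 bits vs 2^77 blocks of memory.
    ZiB, YiB = 2**70, 2**80            # zebibyte and yobibyte, in bytes

    print(2**77 / 8 / ZiB, "ZiB")      # 16.0 ZiB  -> the ~16 ZiB figure
    print(2**77 * 16 / YiB, "YiB")     # 2.0 YiB   -> ~128x more, if it's 2^77 16-byte blocks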
To date, the best unconditional attack on any full variant of AES provides a factor of ~4 speedup, although it requires 9 PB of data just for AES-128.
> It ignores the requirement that secret data needs to stay secret for 30 years, or 100 years, or long into the future, and attacks only get better.
What data has to stay secret for 100 years?
To extrapolate backwards, was there anything in 1925 that would still be sensitive today? It's hard to imagine.
"I don't know of any long-lasting secrets" ≠ "There is / will be no need for long-lasting secrets"
The fact you don't know about these might in fact simply indicate the efficacy of the secret keepers.
> "I don't know of any long-lasting secrets" ≠ "There is / will be no need for long-lasting secrets"
This feels like a bad argument for religion.
The point though is not that I don't know any, but that I can't conceive of any. I can't even imagine such a scenario, even hypothetically.
Sure, but for the analogy to hold, that inequality of meaning would have to lead to "therefore I conclude this specific, highly infeasible, self-contradictory secret exists" - which is perhaps a common problem with arguments for religion.
I'm confident there are fairly mundane multi-generational secrets, without having to summon the Illuminati or the Knights Templar. Either way, it doesn't negate the interest in having a technology that could provide that.
Cryptography isn't a technology for keeping secrets; it's a technology for keeping secrets in transit. It's not particularly useful for keeping multigenerational secrets (how do you do key management over 100 years?).
Is your suggestion that key rotation is a necessary requirement?
I feel we're coming full circle towards the original discussion about pqc.
(Also, I feel cryptography is very much a tech that can assist you in keeping secrets at rest.)
> Is your suggestion that key rotation is a necessary requirement?
If you want your secret to last more than one human lifetime, you have to enroll new people into the system somehow.
My main argument would be that cryptography is mostly useless in such a scenario. It makes much more sense to put the secret in a filing cabinet, put a lock on the filing cabinet, and if you are really paranoid, maybe hire some people with guns to guard it. Cryptography for such a scenario is the sort of thing that happens in movies not real life.
And even if cryptography were used, it doesn't seem like public-key would be very applicable at all, so PQC is extra irrelevant.
Diplomatic communications about how you plan / succeed at undermining allies. Or communications about atrocities you knew were happening, but decided to ignore.
There is plenty of reason to want to keep diplomatic and military communications secret for a long time.
> Diplomatic communications about how you plan / succeed at undermining allies. Or communications about atrocities you knew were happening, but decided to ignore.
> There is plenty of reason to want to keep diplomatic and military communications secret for a long time.
I don't think that makes sense. Why would you want to keep implicating communications around for 100 years? Wouldn't you just destroy them?
Cryptography isn't useful for secrets you want nobody to know. It's useful for secrets you want some people to know but not others.
That said, it also seems questionable how much people care about atrocities a hundred years after the fact. For example, nobody is boycotting IBM today for its role in the Holocaust.
News of these things does come out from time to time, usually over a shorter time period, and these revelations create embarrassment, shock, pain, and anger, but has any had significant substantive consequences? Here is a hypothetical one to consider: FDR secretly informed Hitler that the US would support an invasion of the USSR - how far would the consequences of such a revelation reach if it came out today?
It's not so much about the impact of the secrets leaking. Instead, it's about the impact on communications if diplomats need to worry about their communications leaking.
idk why you're fixated on 100 years, but stuff like nuclear weapons tech is 1940s-1960s technology and that's still classified.
> idk why you're fixated on 100 years
Because that was the number the person I was responding to gave.
In any case, North Korea has the bomb. I think the secret is out. The most difficult thing at this point is the engineering challenge, not the book knowledge.
I was under the impression that information about how to build nukes was mostly well known by most countries, and it is just a matter of getting enough of the right type of uranium or whatever.
And will still be classified in 2045...
My genetic data will be relevant even after I'm dead because my children and grandchildren share it with me. And it's a modern kind of data that didn't exist in 1925.
I hate to tell you, but even if you have never done 23andMe or anything similar, enough of your family has that your genetic data is already very readily accessible to the parties who need it.
Realistically you can't keep that secret though. There are a lot of people who share enough of your DNA to reconstruct parts of it. Possibly hundreds. And all it takes is a hair follicle or spit.
You are never keeping that secret against an interested adversary.
Your genetic data is not secret though. It's rather easy to obtain during your lifetime, even without you knowing.
Anything tied to a blockchain.
DJB made a brief critique of this argument: https://blog.cr.yp.to/20250118-flight.html#rsa
I think it is worthwhile to explore cryptography that is not based on the discrete logarithm problem. Currently, we're keeping all our eggs in one conjectured basket. Even if quantum computing never becomes viable, there is a non-zero chance that the discrete logarithm problem will be solved in some other way.
What are you talking about? PQC is based on learning-with-errors and lattice problems
https://en.wikipedia.org/wiki/Learning_with_errors
There is a non-zero chance of pretty much anything. I don't think Gutmann is opposed to exploration.
I think the parent's point is that if there is a non-zero chance of anything then it is good to have a backup plan
There is a non-zero chance of a black hole forming in the LHC and swallowing the Earth, yet we don't have a backup plan for that. There are infinitely more examples like that. A non-zero chance is not an argument for anything, unless you have a good basis for actually quantifying it.
This is just one big article on survivor bias.
I get what they are saying: There is a difference between theoretical and applied.
I think the OWASP/NIST/InfoSec world has always been a bit behind because of this mentality. I think there is a progressive, forward-looking mindset that is often seen as "mad" or "unhinged" when it's ultimately throwing paint at a wall to see what sticks.
The driver is curiosity but then someone comes along and applies CBA and ROI, and CAC...the person who was curious has left because that wasn't the goal. Eventually something will stick that meets all of those mainstream ideas.
If you think of the body as a computer, it communicates through DNA, a much larger scale of information passing. Binary is just arbitrarily selected because it was there. Should we stop exploring binary computational systems? No but we also don't need all our eggs in one basket.
This smells a lot like wishcasting masquerading as critique. I just attended a PQC conference, so I'm biased, but this paper's author makes a lot of very strong claims about the infeasibility of future attacks that most experts in the field would disagree with. There is hope that this is all a fire drill and RSA/EC will survive the next decade unscathed, but there's also plenty of evidence to suggest that incremental improvements in quantum compute will eventually reach their goal. Rather than a big cannon, I see it looking more like AI/LLMs: lots and lots of small incremental improvements by researchers were needed to eventually yield some significant advancements. I pray cryptographically relevant quantum computing stays in the realm of cold fusion, but I'm not about to believe it.
Cryptanalysis has already made a few strides toward breaking RSA more quickly, and I've heard from noted cryptanalysts the claim that there's a significant chance RSA will be further broken in the next decade regardless of PQ. It's a scary take to double down on RSA of all things.
Could you relate who those noted cryptanalysts are, that are predicting a significant chance that RSA-2048 is broken classically?
Antoine Joux was on the side of classical cryptanalysis on a 2014 bet. This was right after the small-characteristic discrete log advances, so that might no longer be the bet if it was made today.
https://x.com/hashbreaker/status/494867301435318273
I don't recall their name; they were a guest on the Root Causes podcast discussing PQ topics, though your summary varies from what I was trying to express. It's not that RSA will be classically broken, but that novel attacks that reduce the factoring time of RSA keys, like batch attacks, have a statistically significant chance of being discovered. The "further" was not meant to imply "completely broken classically," but "weakened further using classical approaches". Sounded plausible to me, though that's not a thing I'm any kind of domain expert in.
Right, batch attacks certainly threaten 1024 bit RSA, but, obviously, 2048 bit RSA is not just incrementally harder to break than 1024 bit RSA.
Anything specific at all would be helpful.
https://podcasts.apple.com/us/podcast/root-causes-408-takeaw...
Sorry, I'm busy; I might have time to look the podcast up in a day or two, but I don't think there's any actual value in that to anyone over an offhand comment, so forgive me if I find other things to do instead.
Out of curiosity: which conference did you attend? I find the field interesting.
PKI Consortium: https://pkic.org/events/2025/pqc-conference-austin-us/
one thing I agree with is that software/standards people like churn; churn is a source of new income. Regardless of whether QC could break RSA in the future, people can make a profit now.
Wish openssh would take that presentation to heart instead of constantly breaking everyone's setups with new keys.
the main takeaway is that complexity is the enemy of security; we can all agree with this, no? ... has anyone ever opened the thousands-of-pages bluetooth standard?
this is a really cool presentation to read, regardless of whether people agree with it or not. regarding the QC stuff ... i have no idea.
The NSA may not be breaking RSA right now, but I'd bet good money they are storing some traffic for later. So the whole "factorization isn't worth it" argument feels silly.
Man, some real "Cynicism is the intellectual cripple's substitute for intelligence" energy here. Seems unnecessary given what I read of Gutmann's history.
I get it must be annoying to be someone working in cryptography and always be hearing about QC when there are endless security issues today. It must be tiring to have all these breathless pop-science articles about the quantum future, startups claiming ridiculous timelines to raise money on hype, and business seminars where consultants claim you'll need to be prepared for the quantum revolution changing how business works. I feel the same way.
But you shouldn't let that drive you so far in the opposite direction that you're extrapolating fun small quantum factoring experiments from a decade ago to factoring 1024-bit keys in the year 4000. Or saying things like "This makes the highly optimistic assumption that quantum physics experiments scale linearly... the evidence we have, shown by the lack of progress so far, is that this is not the case". If we get fault-tolerant QC, of course it scales linearly, and it seems embarrassing as a computer scientist not to understand the difference between a constant and an asymptote. "Actually, quantum computers are so new and untested that they really qualify as physics experiments"... yeah? And?
None of this is to say that fault-tolerant highly scalable QC implementing Shor's algorithm is just around the corner, I truly believe it's not. But the world of QC is making really interesting advances running some of the coolest experiments around and I find this superior Hossenfelder-like cynicism in the face of real science making real progress so so tiring.
It's strange to see so many negative responses that start with vague emotional language. It's almost as if a lot of critics didn't read the presentation. Or maybe they think the rest of us didn't read it.
I read the whole presentation. The physics-experiment criticism Gutmann makes that I referred to is at page 16/30. Nothing after that engages with QC to the extent that the first half of the presentation did, so I didn't refer to later parts.
Is it really that strange when the slides themselves are pretty emotionally charged?
Good question. Yes it is strange, because the information on the slides is mainly numerical. For example the integers "15" and "21" and the years "2002" and "2012" don't pack much of an emotional charge for me. I suspect they wouldn't for most people.
You missed the number 15360 on p. 12, which is mostly what I was referring to.
One major point of the presentation here is that it's not making real progress. People are still publishing papers, but they have done nothing with an effect outside their little community. It's been in roughly the same state for the last 10 years. For a minimum of 30 years, there have been promises of amazing things coming in the next decade in QC. After how many decades should those predictions lose credibility?
There is real opportunity cost to doing this stuff, and real money getting sucked up by these grifters that could be spent on real problems. There are real PhD students getting sucked down this rabbit hole instead of doing something that actually does make progress. There is a real cost to screwing around and making promises of "next decade."
> One major point of the presentation here is that it's not making real progress.
How are you measuring "real progress"?
> People are still publishing papers, but they have done nothing with an effect outside their little community.
Having an effect outside the research community is essentially a Heaviside function. Before the field is mature enough, there is no effect outside, but once the field is mature enough, there is an effect outside. Makes it hard to judge if there is any progress or not.
The field has had 40 years of maturing. Experimentation on QC started in the 1980s. At what point are we going to be factoring numbers or (more realistically) simulating chemical interactions?
Real progress in this field is very easy to measure: it's the number of effective qubits of computation. That is just a metric where QC is failing to deliver so badly that everyone in the field wants to deny its existence.
Unfortunately, the level of investment in QC is very much outsized compared to the level of progress. These things should rise at the same time. More promising areas of science can get the investment that is otherwise being sucked into QC.
> Having an effect outside the research community is essentially a Heaviside function.
This is something that people like to say, but it is never true. Impact on the outside world for new technologies is almost always a sigmoid function, not a Heaviside function. You should see some residual beneficial effects at the leading edge if you have something real.
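A tiny numeric sketch of the difference (purely illustrative parameters):

    import math

    # Heaviside step vs logistic sigmoid around a "takeoff" time t0 = 0:
    # the sigmoid already shows small but non-zero effects before takeoff; the step shows none.
    def heaviside(t, t0=0.0):
        return 0.0 if t < t0 else 1.0

    def sigmoid(t, t0=0.0, k=1.0):
        return 1.0 / (1.0 + math.exp(-k * (t - t0)))

    for t in range(-6, 7, 2):
        print(f"t={t:+d}  step={heaviside(t):.1f}  sigmoid={sigmoid(t):.4f}")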
> Real progress in this field is very easy to measure. It's based on number of effective qbits of computation.
There are plenty more progress measures to use (decoherence times, gate fidelities/error rates), and we have made significant progress on those over the last 10 years.
I agree! People who predicted QC soon over the last few decades should lose credibility. They were wrong and they were wrong for no good reason. There is a real opportunity cost to focusing on the wrong thing. There are definitely grifters in the space. Responsible QC researchers should call it out (e.g. Scott Aaronson).
But it doesn't necessarily follow that you can dismiss the actual underlying field. Within the last five years alone we've gone from the quantum supremacy experiment to multiple groups using multiple technologies to claim QEC code implementations with improved error rates over the underlying qubits. People don't have to be interested in these results, they are rather niche (a little community as you put it), but you shouldn't be uninterested and then write a presentation titled 'Why Quantum Cryptanalysis is Bollocks'.
Well, when the little community circles the wagons around the grifters instead of excising them, the rest of us get to ask questions about that community. The cold fusion community did the same thing for several decades, too.
And by the way, about 0.01% of the grifters in the QC space are getting called out right now.
Yes, but it can also be like EUV.
Not working for years upon years, and then suddenly it works, and if it does, it can be a huge problem. I don't know if I trust PQC though, but that only means that more research on it is needed.
Is there any recording of this being presented?
This is nonsense.
- Crypto building blocks are important basic research because it underpins everything.
- Good crypto (this exists, btw) is impossible to beat unless QC is available. That's why PQC is being researched. Think about what kind of crypto the NSA wants to break; it's not your Bank of America passwords.
- IDK why this guy thinks we need to shut down Los Alamos to do crypto; does he not think the NSA has datacenters of its own?
- The problem with "well, it's not a problem now, why are we preparing for it" is that nation states are storing everything going over the internet, waiting for when QC becomes viable. This essentially means you can assume every secret you have will not be secret in 10/20/50 years. Your password is probably fine, but if you sent a secret diplomatic cable today, it might be unlocked for your adversaries some years later. These secret nation-state comms are normally designed to be declassified after N years, since keeping them secret forever is expensive; PQC is simply designed to uphold that number N.
- The NSA is generally known to be several decades ahead of academia. They infamously knew about and corrected a differential cryptanalysis vulnerability in DES long before differential cryptanalysis was known in the public community. Saying QC isn't growing fast enough doesn't mean much.
- The 2008 financial crisis metaphor is the only one that seemed poignant
>Think about what kind of crypto NSA wants to break, it's not your bank of america passwords.
I'm not very bright; what kind of encryption does the NSA want to break?
The crypto that China and Russia use, among others.