4/6/2026 at 5:47:52 PM
If little time indeed remains before a usable quantum computer becomes available, the priority is deploying FIPS 203 (ML-KEM) to establish the secret session keys used in protocols like TLS or SSH. ML-KEM is intended to replace the traditional and elliptic-curve variants of the Diffie-Hellman algorithm for creating a shared secret value.
When FIPS 203, i.e. ML-KEM, is not used, adversaries can record data transferred over the Internet today and may become able to decrypt it some years from now.
On the other hand, there is much less urgency to replace the certificates and digital signature methods used today, because in most cases it would not matter if someone became able to forge them in the future: an attacker cannot go back in time to use a future forgery for authentication.
The only exception is digital documents with lasting legal significance, e.g. digitally signed documents that completely replace traditional paper documents proving ownership of something. Forging those in the future could be useful to somebody, so a future-proof signing method would make sense there.
OpenSSH, OpenSSL, and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could easily be deployed, at least for private servers and clients, without also replacing the existing authentication methods, e.g. certificates, where post-quantum signing methods would add a lot of overhead due to much bigger certificates.
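As a toy illustration of why hash-based post-quantum signatures make certificates so much bigger, here is a minimal Lamport one-time signature, stdlib Python only. This is not ML-DSA or SLH-DSA, just a sketch of the hash-based principle; all names are made up for illustration.

```python
# Toy Lamport one-time signature: quantum-resistant in principle, because
# forging it requires inverting a hash function, but the signature alone is
# 256 * 32 = 8192 bytes -- an example of the size overhead mentioned above.
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # Private key: two random 32-byte values for each of the 256 message bits.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hashes of all private values.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg: bytes):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Reveal one preimage per message-digest bit; this key must never sign again.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, msg: bytes, sig) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))
```

Real schemes add Merkle trees (to sign many messages) and various compressions, but the size pressure on certificates comes from the same place.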
by adrian_b
4/6/2026 at 5:55:54 PM
That was my position until last year, and pretty much a consensus in the industry. What changed is that the new timeline might be so tight that (accounting for specification, rollout, and rotation time) the time to switch authentication has also come.
ML-KEM deployment is tangentially touched on in the article because it's both uncontroversial and underway, but:
> This is not the article I wanted to write. I’ve had a pending draft for months now explaining we should ship PQ key exchange now, but take the time we still have to adapt protocols to larger signatures, because they were all designed with the assumption that signatures are cheap. That other article is now wrong, alas: we don’t have the time if we need to be finished by 2029 instead of 2035.
> For key exchange, the migration to ML-KEM is going well enough but: 1. Any non-PQ key exchange should now be considered a potential active compromise, worthy of warning the user like OpenSSH does, because it’s very hard to make sure all secrets transmitted over the connection or encrypted in the file have a shorter shelf life than three years. [...]
Your comment is essentially the premise of the other article.
by FiloSottile
4/6/2026 at 6:04:42 PM
I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately. However, that does not mean that the switch should really be done as soon as it is possible, because it would add unnecessary overhead.
This could be done by distributing a set of post-quantum certificates, while continuing to allow the use of the existing certificates. When necessary, the classic certificates could be revoked immediately.
by adrian_b
4/6/2026 at 9:58:16 PM
> I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.

Personally, my reading between the lines on this subject as a non-expert is that we in the public might not know when post-quantum cryptography is necessary until quite a while after it is necessary.
Prior to the public-key cryptography revolution, the state of the art in cryptography was locked inside state agencies. Since then, public cryptographic research has been ahead of or even with state work. One obvious tell was all the attempts to force privately-operated cryptographic schemes to open doors to the government, e.g. via the Clipper chip and other appeals to magical key escrow.
A whole generation of cryptographers grew up in this world. Quantum computing might change things back. We know what the papers from Google and other companies say. Who knows what is happening inside the NSA or military facilities?
It seems that with quantum computing we are back to physics, and the government does secret physics projects really well. This paragraph really stood out to me:
> Scott Aaronson tells us that the “clearest warning that [he] can offer in public right now about the urgency of migrating to post-quantum cryptosystems” is a vague parallel with how nuclear fission research stopped happening in public between 1939 and 1940.
by snowwrestler
4/6/2026 at 10:25:22 PM
> Since then, public cryptographic research has been ahead of or even with state work.

How can we know that?
> Who knows what is happening inside the NSA or military facilities?
Couldn't the NSA have found an issue with ML-KEM and tried to convince people to use it exclusively (not in a hybrid scheme with ECC)?
by raron
4/6/2026 at 11:41:13 PM
Couldn't NSA have not known about an issue with ML-KEM, and thus wanted to prevent its commercial acceptance, which it did simply by approving the algorithm?

What's the PQC construction you couldn't say either thing about?
by tptacek
4/7/2026 at 11:00:11 AM
> Couldn't NSA have not known about an issue with ML-KEM, and thus wanted to prevent its commercial acceptance, which it did simply by approving the algorithm?

They could, but they did not do that. So the question becomes: why?
by deknos
4/7/2026 at 2:37:13 PM
I think you may have missed my point.
by tptacek
4/6/2026 at 10:48:19 PM
Follow NSA Suite B and what the USA mandates at different levels of classification.
by goalieca
4/7/2026 at 1:54:40 PM
Kyber/ML-KEM-only is exactly the Suite B (CNSA 2) recommendation.
by formerly_proven
4/6/2026 at 7:32:59 PM
Planning now on a fast upgrade later is planning on discovering all of the critical bugs after it is too late to do much about them. Things need to be rolled out in advance of need, so that you can get a do-over in case there proves to be a need.
by btilly
4/6/2026 at 6:14:13 PM
How do you do revocation or software updates securely if your current signature algorithm is compromised?
by FiloSottile
4/6/2026 at 6:24:14 PM
As a practical matter, revocation on the Web is handled mostly by centrally distributed revocation lists (CRLsets, CRLite, etc. [0]), so all you really need is: (1) a PQ-secure way of getting the CRLs to the browser vendors, and (2) a PQ-secure update channel.
Neither of these require broad scale deployment.
However, the more serious problem is that if you have a setting where most servers do not have PQ certificates, then disabling the non-PQ certificates means that lots of servers can't do secure connections at all. This obviously causes a lot of breakage and, depending on the actual vulnerability of the non-PQ algorithms, might not be good for security either, especially if people fall back to insecure HTTP.
See: https://educatedguesswork.org/posts/pq-emergency/ and https://www.chromium.org/Home/chromium-security/post-quantum...
[0] The situation is worse for Apple.
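The central-distribution model above can be sketched in a few lines of Python. Everything here is illustrative, not any real browser's format or API; the trusted digest stands in for "delivered over a PQ-secure update channel":

```python
# Sketch of centrally distributed revocation in the spirit of CRLsets: the
# vendor ships a revocation blob, the client receives only its digest over a
# trusted (PQ-secure) update channel, then checks revocations locally.
import hashlib

def parse_crl_blob(blob: bytes, trusted_digest: bytes) -> set:
    # Reject any blob that doesn't match the digest from the update channel.
    if hashlib.sha256(blob).digest() != trusted_digest:
        raise ValueError("revocation blob does not match trusted digest")
    # Toy wire format: whitespace-separated hex serial numbers.
    return {bytes.fromhex(token) for token in blob.decode().split()}

def is_revoked(revoked: set, serial: bytes) -> bool:
    # Pure local set membership -- no per-connection network fetch needed.
    return serial in revoked
```

The point of the sketch is that the only cryptography the client needs for revocation itself is the integrity of the update channel, which matches the "neither of these require broad scale deployment" observation.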
by ekr____
4/6/2026 at 6:38:39 PM
Indeed, in an open system like the WebPKI it's fine in theory to only make the central authority PQ, but then you have the ecosystem adoption issue. In a closed system, you don't have the adoption issue, but the benefit of making only the central authority PQ is likely to be a lot smaller, because it might actually be the only authority. In both cases, you need to start moving now and gain little from trying to time the switchover.
by FiloSottile
4/6/2026 at 6:46:35 PM
> In both cases, you need to start moving now and gain little from trying to time the switchover.

There are a number of "you"s here, including:
- The SDOs specifying the algorithms (IETF mostly)
- CABF adding the algorithms to the Baseline Requirements so they can be used in the WebPKI
- The HSM vendors adding support for the algorithms
- CAs adding PQ roots
- Browsers accepting them
- Sites deploying them
This is a very long supply line and the earlier players do indeed need to make progress. I'm less sure how helpful it is for individual sites to add PQ certificates right now. As long as clients will still accept non-PQ algorithms for those sites, there isn't much security benefit so most of what you are doing is getting some experience for when you really need it. There are obvious performance reasons not to actually have most of your handshakes use PQ certificates until you really have to.
by ekr____
4/6/2026 at 6:56:48 PM
Yeah, that's an audience mismatch, this article is for "us." End users of cryptography, including website operators and passkey users (https://news.ycombinator.com/item?id=47664744), can't do much right now, because "we" still need to finish our side.
by FiloSottile
4/6/2026 at 10:03:17 PM
If your HSM vendor isn't actively working on, or doesn't have a release date for, GA PQ support, you should probably get a new vendor.
by fireflash38
4/7/2026 at 7:10:12 AM
I do not understand the fixation on authentication/signatures. They have different threat characteristics: you cannot retroactively forge historical authentication sessions, future forgery ability does not compromise past data, and it only matters for long-lived signed artifacts (certificates, legal documents, etc.). Yet the thread apparently keeps pivoting to signature deployment complexity?
I do not get it.
by johnisgood
4/7/2026 at 7:58:21 AM
The argument is that deploying PQ-authentication mechanisms takes time. If the authenticity of some connections (firmware signatures, etc.) is critical to you and news comes out that "cheap" quantum attacks are going to materialize in six months, but you need at least twelve months to migrate, you are screwed.

There is also a difference between closed ecosystems and systems composed of components from many different vendors and suppliers. If you are Google, securing the connection between data centers on different continents requires only trivial coordination. If you are an industrial IoT operator, you need dozens of suppliers to flock around a shared solution. And for comparison, in the space of operational technology ("OT"), there are still operators that choose RSA for new setups, because that is what they know best. Change happens at a glacial pace there.
by Perseids
4/7/2026 at 1:40:49 PM
The actual revocation needn't be secure; false revocations are an oxymoron. The practice around revocations needs to be secure, of course, but that's more an engineering problem than a cryptographic one.
by xorcist
4/7/2026 at 1:49:41 PM
Can you explain a bit more what you mean by "secure" in the context of "actual revocations"? The oxymoronic nature isn't self-evident enough for me to catch your intended meaning before my first cup of coffee.
by some_furry
4/7/2026 at 4:38:35 PM
If you receive a forged CRL, in the worst case it will revoke certificates that you can't trust anyway. Even if it says "certificate X is still good", that's equivalent to receiving no CRL.
by GoblinSlayer
4/7/2026 at 11:00:57 AM
Use a PSK signature.
by GoblinSlayer
4/7/2026 at 6:00:03 AM
> when it becomes necessary

Perhaps it's already necessary, or it will be in the following months. We are hearing only about the public developments, not whatever classified work the US is doing.
I think the analogy with the Manhattan Project is apt. The US has an enormous interest in decrypting communication streams at scale (see Snowden and the Utah NSA datacenter), and it is known for storing encrypted comms to decrypt later. Well, maybe later is now.
by nextaccountic
4/7/2026 at 7:32:31 AM
Super important: don't replace traditional (elliptic-curve) Diffie-Hellman with ML-KEM, but enhance it by using hybrid key exchanges. Done that way, an attacker needs to break both the classical and the post-quantum cryptography to launch an attack.

If you worry about a >=1% risk of quantum attacks being available soon, you should also worry about a >=1% risk of the relatively new ML-KEM being broken soon. The risk profile is pretty comparable. For both cases there are credible expert opinions that say the risk is incredibly overrated and credible expert opinions that say the risk is incredibly underrated.
Filippo has linked opinions that quantum attacks are right around the corner. People like Dan Bernstein (djb) are throwing all their weight behind stressing that anything but hybrids is irresponsible. I don't think there is anybody who says "hybrids are a bad idea", just people who want to make it easy to choose non-hybrid ML-KEM.
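The hybrid construction being argued for boils down to: concatenate the classical and post-quantum shared secrets and run them through a KDF, so either secret alone is useless to an attacker. A stdlib-only sketch, with placeholder byte strings standing in for real ECDH and ML-KEM outputs (in TLS, the X25519MLKEM768 group does essentially this concatenation):

```python
# Sketch of hybrid key derivation: HKDF (RFC 5869 style, extract-then-expand)
# over the concatenation of a classical and a post-quantum shared secret.
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    out, t, counter = b"", b"", 1
    while len(out) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        out += t
        counter += 1
    return out[:length]

def hybrid_session_key(ecdh_secret: bytes, mlkem_secret: bytes) -> bytes:
    # Both inputs feed the KDF, so recovering the session key requires
    # breaking BOTH the classical and the post-quantum exchange.
    prk = hkdf_extract(b"hybrid-kex", ecdh_secret + mlkem_secret)
    return hkdf_expand(prk, b"session key", 32)
```

The labels `"hybrid-kex"` and `"session key"` are invented for the sketch; real protocols bind the transcript and group names into the KDF inputs.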
by Perseids
4/7/2026 at 11:25:39 AM
How do you mean the risk profile is comparable, when ECDH is nearly guaranteed to be broken in five years and Kyber is two decades old? The two have nothing to do with each other, the ECDH component of a hybrid becomes worthless before you next replace your smartphone, and bloating the protocol can only hurt adoption. Yes, djb keeps making the same crankish complaint without any evidence or reason, that doesn't mean you have to repeat it uncritically.
by pie_flavor
4/7/2026 at 1:26:22 PM
> How do you mean the risk profile is comparable

Exactly in the way the succeeding sentence defines it: "For both cases there are credible expert opinions that say the risk is incredibly overrated and credible expert opinions that say the risk is incredibly underrated."
> when ECDH is nearly guaranteed to be broken in five years
Most of your argument (and that of many others pushing the contra-hybrid point) hinges on this. I don't think this position is justified. I believe there is significant risk for quantum attacks in the near term (and thus fully support the speedy adoption of hybrids), yes, but quite far away from certainty. Personally, I'd even say better than coin-flip is pushing it. I mean, look at what Scott Aaronson is writing on that matter:
"I also continue to profess ignorance of exactly how many years it will take to realize those principles in the lab, and of which hardware approach will get there first. […] This year [=2025] updated me in favor of taking more seriously the aggressive pronouncements—the “roadmaps”—of Google, Quantinuum, QuEra, PsiQuantum, and other companies about where they could be in 2028 or 2029." -- https://scottaaronson.blog/?p=9425
This is nothing like "nearly guaranteed" in five years.
> and Kyber is two decades old
But the implementations aren't, and it hasn't been under heavy scrutiny for that long. One can very much make the point that we weren't that critical when elliptic-curve cryptography entered the scene, but we now have the luxury of heavily battle-tested primitives and implementations at our disposal, so why throw them out the window so eagerly? Also, an interesting comparison with elliptic-curve cryptography: it took until 2005 to get good key-exchange primitives and until 2011 to get good signature primitives (Curve25519, now known as X25519, and Ed25519 respectively), and mainstream availability of those took way longer.
Coming back to this again, for a second remark:
> when ECDH is nearly guaranteed to be broken in five years
Another important point is that any quantum attack on ECDH will require inherently expensive equipment for the foreseeable future (see adgjlsfhk1's comment: https://news.ycombinator.com/item?id=47665561), whereas a stupid Kyber implementation error in a mainstream library could very well end up being exploitable by a Metasploit plugin. Our threat model should most definitely include nation-state attackers prominently, but they are not at all the only attackers we should focus on. There is still significant value in keeping out attackers that did not spend >$100k on equipment.
> Yes, djb keeps making the same crankish complaint without any evidence or reason, that doesn't mean you have to repeat it uncritically.
I did not repeat it uncritically; I just happen to share his conclusion, even after months of following the pro and contra discussion. Also, how can you say he complains without reason? He has explained his reasons at length, see https://cr.yp.to/2025/20250812-non-hybrid.pdf for example. Whether his methods of complaining are commendable or effective is another topic, though.
by Perseids
4/7/2026 at 2:39:33 PM
I would be interested in seeing you rattle off the "pros and cons" of this argument, just as a synchronization mechanism for the thread so we'd know if we're on the same page.by tptacek
4/7/2026 at 4:25:41 PM
Off the top of my head?

Pro hybrid:
- Negligible performance impact: negligible for battery devices, for data sent over the wire (number of packets; sub-discussion about specific circumstances, time on the air for cellular), for speed, and for code size
- Little implementation effort, as every library already has ECC in it
- ML-KEM is too new (yes, actually old, but with far less research interest, and the implementations are new)
- Conservative design choice

Pro ML-KEM only / produce a TLS RFC for non-hybrid ML-KEM:
- Reduction in complexity
- Reduction of transitions (non-hybrid is going to be the final state, so let's skip ahead already)
- Lattice crypto is actually an old branch of cryptography (discussion over different metrics)
- NSA says it's secure for government use
- NSA stipulates use of non-hybrid and we want/need to be compatible
- We want/need to have a well-defined place to have a reference; if people are going to write an RFC to document non-hybrid ML-KEM, let us at least have influence over what is written there
- Better performance (speed, data on the wire, number of packets in the handshake, energy budget)
- Actually the non-hybrid TLS connection is intended to be the inner one while the outer transport is secured with classic cryptography (or vice versa), so hybrids are a complete waste
- For any interesting timeline ECC is broken anyway, so it is a useless burden
- We just want choice, dammit; don't undermine the process, dammit

Pro hybrid only / don't produce a TLS RFC for non-hybrid ML-KEM:
- Let's not make it easy for people to choose wrongly by accident/incompetence/malice
- Actually no complexity reduction, as implementations still need to implement hybrids to be compatible
- The TLS WG publishing something has weight and might sway others to consider non-hybrid ML-KEM
- NSA might have pushed for non-hybrid ML-KEM because they believe only they can break it
- Don't care if US institutions are pushing for non-hybrid ML-KEM for weird internal political reasons
- Don't you see how this is all a ploy to weaken our crypto again?
- Don't undermine the process, dammit
Did I forget any important talking point? The TLS WG discussion is actually quite tiresome. For anybody new to the party, here is a random pointer to a current thread: https://mailarchive.ietf.org/arch/msg/tls/7OGS_X1e-zG8O0eRJP...
by Perseids
4/8/2026 at 2:24:17 AM
One more pro hybrid only: the reduction of transitions is doubtful, since by the time PQC is clearly better, we're likely to have better PQC algorithms (and/or better attacks that force more conservative parameters). At a bare minimum, we aren't ready to move to pure PQC until we can go a couple of years without continued improvements in lattice reduction algorithms.
by adgjlsfhk1
4/8/2026 at 4:15:40 AM
This is like saying we should have halted all RSA deployments until improvements in sieving stopped happening. The lattice contestants were all designed assuming BKZ would continually improve. It's not 1994 anymore, asymmetric cryptography is not a huge novelty to the industry, nobody is doing the equivalent of RSA-512.by tptacek
4/8/2026 at 7:02:17 AM
> This is like saying we should have halted all RSA deployments until improvements in sieving stopped happening.

Absolutely not. If people were advocating for ECC only, you would have a point. But this thread is about hybrids vs. ML-KEM-only (for key exchange!). Everybody here wants to deploy the algorithm you're favoring, and wants to deploy it now, just not without a safety net.
by Perseids
4/8/2026 at 1:09:48 PM
I don't understand. We didn't have hybrids for RSA while sieving improved.by tptacek
4/8/2026 at 1:48:16 PM
RSA was the first. If ECC didn't exist, no one would be saying that we have to hybridize Kyber, but since it does, and the hybrid has ~0% overhead, it's very silly not to.
4/7/2026 at 1:07:04 PM
> when ECDH is nearly guaranteed to be broken in five years

Says who?
There's a big difference between "we can't be sure that ECDH stays secure for five more years" and "ECDH is nearly guaranteed to be broken". There have been two major papers at the beginning of the year that advanced the state of the art enough to question the prior assumption about the slowness of QC progress. Now we know that rapid advances are possible and we must take that into account in risk assessment. But that doesn't mean that rapid advances are guaranteed. Things could stay stagnant for 15 more years before the next breakthrough. And if that's the case, then ECDH could very well remain relevant for the rest of the century.
We just cannot know whether it will happen, so we can't take the risk. But that doesn't mean we are certain the risk will materialize.
by littlestymaar
4/6/2026 at 6:04:42 PM
> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.

This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering themselves currently use RSA or EC.
by layer8
4/6/2026 at 6:37:28 PM
Yes, though we do know how to solve this problem by using hash-based timestamping systems. See: https://link.springer.com/article/10.1007/BF00196791

Of course, the modern version of this is putting the timestamp and a hash of the signature on the blockchain.
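The linked paper's idea can be sketched as a hash chain: each timestamp entry commits to the document hash, the time, and the previous entry, so back-dating a document later means recomputing every subsequent link. This is a toy, stdlib-only sketch under invented names; real services add Merkle aggregation and widely witnessed anchors.

```python
# Toy hash-linked timestamping in the Haber-Stornetta style: security rests
# only on the hash function, so no RSA/EC signature needs to survive.
import hashlib

def link(prev: bytes, doc_hash: bytes, time: str) -> bytes:
    # Each link commits to the previous link, the document, and the time.
    return hashlib.sha256(prev + doc_hash + time.encode()).digest()

def build_chain(entries):
    # entries: list of (doc_hash, timestamp-string) in submission order.
    chain, prev = [], b"\x00" * 32
    for doc_hash, time in entries:
        prev = link(prev, doc_hash, time)
        chain.append(prev)
    return chain
```

Publishing the latest link somewhere widely witnessed (a newspaper, a blockchain) is what makes retroactive forgery detectable.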
by ekr____