12/27/2025 at 6:45:31 PM
Okay, since there’s so much stuff to digest here and apparently there are issues designated as wontfix by GnuPG maintainers, can someone more in the loop tell us whether using gpg signatures on git commits/tags is vulnerable? And is there any better alternative going forward? Like is signing with SSH keys considered more secure now? I certainly want to get rid of gpg from my life if I can, but I also need to make sure commits/tags bearing my name actually come from me.
by oefrha
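For readers in the same boat: switching git to SSH signing is a few config lines. A minimal sketch (assumes Git 2.34+ and an existing Ed25519 key; the key path is illustrative):

    # sign commits and tags with an SSH key instead of OpenPGP
    git config --global gpg.format ssh
    git config --global user.signingkey ~/.ssh/id_ed25519.pub
    git config --global commit.gpgsign true   # sign every commit by default
    git commit -S -m "signed with SSH"        # -S is redundant once gpgsign=true

Verifiers need a similarly recent Git plus an allowed-signers file; see the sketch further down the thread.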
12/27/2025 at 7:44:22 PM
One of those WONTFIXes is on an insane vulnerability: you can bitflip known plaintext in a PGP message to switch it into handling compression, allowing attackers to instruct GnuPG packet processing to look back to arbitrary positions in the message, all while suppressing the authentication failure message. GPG's position was: they print, in those circumstances, an error of some sort, and that's enough. It's an attack that reveals plaintext bytes!
by tptacek
12/27/2025 at 10:14:35 PM
Are you referring to "Encrypted message malleability checks are incorrectly enforced causing plaintext recovery attacks"?
Seems like a legitimate difference of opinion. The researcher wants a message with an invalid format to return an integrity failure message. Presumably the GnuPG project thinks that would be better handled by some sort of bad-format error.
The exploit here is a variation on the age-old idea of tricking a PGP user into decrypting an encrypted message and then sending the result to the attacker. The novelty here is the idea of making the encrypted message look like a PGP key (identity) and then asking the victim to decrypt the fake key, sign it and then upload it to a keyserver.
Modifying a PGP message file will break the normal PGP authentication[1] (that was not acknowledged in the attack description). So here is the exploit:
* The victim receives an unauthenticated/anonymous (unsigned or with a broken signature) message from the attacker. The message looks like a public key.
* Somehow (perhaps in another anonymous message) the attacker claims they are someone the victim knows and asks them to decrypt, sign and upload the signed public key to a keyserver.
* They see nothing wrong with any of this and actually do what the attacker wants, ignoring the error message about the bad message format.
So this attack is also quite unlikely. Possibly that affected the decision of the GnuPG project to not change behaviour in this case, particularly when such a change could possibly introduce other vulnerabilities.
[1] https://articles.59.ca/doku.php?id=pgpfan:pgpauth
Added: Wait. How would the victim import the bogus PGP key into GPG so they could sign it? There would normally be a preexisting key for that user so the bogus key would for sure fail to import. It would probably fail anyway. It will be interesting to see what the GnuPG project said about this in their response.
by upofadown
12/27/2025 at 10:24:51 PM
In the course of this attack, just in terms of what happens in the mechanics of the actual protocol, irrespective of the scenario in which these capabilities are abused, the attacker:
(1) Rewrites the ciphertext of a PGP message
(2) Introducing an entirely new PGP packet
(3) That flips GPG into DEFLATE compression handling
(4) And then reroutes the handling of the subsequent real message
(5) Into something parsed as a plaintext comment
This happens without a security message, but rather just (apparently) a zlib error.
In the scenario presented at CCC, they used the keyserver example to demonstrate plaintext exfiltration. I kind of don't care. It's what's happening under the hood that's batshit; the "difference of opinion" is that the GnuPG maintainers (and, I guess, you) think this is an acceptable end state for an encryption tool.
by tptacek
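To make the malleability concrete, here is a toy demo with openssl rather than GnuPG itself (same CFB cipher-mode family, throwaway key/IV, none of PGP's packet framing): corrupt a ciphertext byte and decryption still "succeeds" with no error, because nothing authenticates the ciphertext.

    key=00112233445566778899aabbccddeeff
    iv=000102030405060708090a0b0c0d0e0f
    printf 'attack at dawn, attack at dusk!!' > pt.bin
    openssl enc -aes-128-cfb -K "$key" -iv "$iv" -in pt.bin -out ct.bin
    # overwrite the first ciphertext byte; in CFB this changes plaintext
    # byte 0 in a predictable way and garbles only the following block
    xxd -p ct.bin | tr -d '\n' | sed 's/^../00/' | xxd -r -p > ct2.bin
    openssl enc -d -aes-128-cfb -K "$key" -iv "$iv" -in ct2.bin -out pt2.bin
    echo $?     # 0: the doctored ciphertext decrypts without complaint
    cat pt2.bin

GnuPG layers an MDC and error messages over this, which is exactly what the WONTFIX dispute is about; raw CFB itself offers no integrity.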
12/27/2025 at 9:30:59 PM
Is there a better alternative to GPG?
by akulbe
12/27/2025 at 9:34:47 PM
Everything is better than PGP (not just GPG --- all PGP implementations).
The problem with PGP is that it's a Swiss Army Knife. It does too many things. The scissors on a Swiss Army Knife are useful in a pinch if you don't have real scissors, but tailors use real scissors.
Whatever it is you're trying to do with encryption, you should use the real tool designed for that task. Different tasks want altogether different cryptosystems with different tradeoffs. There's no one perfect multitasking tool.
When you look at the problem that way, surprisingly few real-world problems ask for "encrypt a file". People need backup, but backup demands backup cryptosystems, which do much more than just encrypt individual files. People need messaging, but messaging is wildly more complicated than file encryption. And of course people want package signatures, ironically PGP's most mainstream usage, ironic because it relies on only a tiny fraction of PGP's functionality and still somehow doesn't work.
All that is before you get to the absolutely deranged 1990s design of PGP, which is a complex state machine that switches between different modes of operation based on attacker-controlled records (which are mostly invisible to users). Nothing modern looks like PGP, because PGP's underlying design predates modern cryptography. It survives only because nerds have a parasocial relationship with it.
by tptacek
12/27/2025 at 11:39:01 PM
> It survives only because nerds have a parasocial relationship with it.
I really would like to replace PGP with the "better" tool, but:
* Using my Yubikey for signing (e.g. for git) has a better UX with PGP than with SSH
* I have to use PGP to sign packages I send to Maven
Maybe I am a nerd emotionally attached to PGP, but after a year signing with SSH, I went back to PGP and it was so much better...
by palata
12/28/2025 at 4:04:37 AM
> better UX with PGP instead of SSH
This might be true of comparing GPG to SSH-via-PIV, but there's a better way with far superior UX: derive an SSH key from a FIDO2 slot on the YubiKey.
by computerfriend
12/28/2025 at 11:54:11 AM
I do it with FIDO2. It's inconvenient when having multiple Yubikeys (I always end up adding the entry manually with ssh-agent), and I have to touch the Yubikey every time it signs. That makes it very annoying when rebasing a few tens of commits, for instance.
With GPG it just works.
by palata
12/28/2025 at 1:34:04 PM
For what it's worth: You can set no-touch-required on a key (it's a generation-time option though).
by ahlCVA
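A minimal sketch of that generation-time option (assumes OpenSSH 8.2+ with a FIDO2 authenticator; the filename is illustrative):

    # resident FIDO2 key that signs without requiring a touch
    ssh-keygen -t ed25519-sk -O resident -O no-touch-required \
        -f ~/.ssh/id_ed25519_sk_sign

The touch policy is baked into the credential at creation, which is why it can be scoped to a dedicated signing key.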
12/28/2025 at 2:08:20 PM
Sure, but then it is set to no-touch for every FIDO2 interaction I have. I don't want to touch for signing, but I want to touch when using it as a passkey, for instance.
by palata
12/28/2025 at 11:47:19 PM
This is a per-credential setting, so you can have your SSH signing key be a no-touch key and still use touch confirmation for everything else.
(see "uv" option here https://fidoalliance.org/specs/fido-v2.0-ps-20190130/fido-cl... - the -sk key types in SSH are just a clever way of abusing the FIDO protocol to create a signing primitive)
by ahlCVA
12/29/2025 at 4:39:40 PM
Oh, I need to check this! Thanks!
by palata
12/28/2025 at 3:04:31 PM
Use the PIV applet for SSH and signing Git commits instead? Git supports S/MIME, and SSH can use keys over PKCS#11 basically out of the box on OSs that don't ship gpg-agent (which just interferes with SmartCard usage in general).
by Avamander
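A rough sketch of that setup (assumes OpenSC is installed; the provider path varies by OS, the S/MIME half assumes a certificate is already on the card, and smimesign is one x509 signing helper among others):

    # SSH via the PIV applet, through PKCS#11
    ssh-add -s /usr/lib/opensc-pkcs11.so     # load smartcard keys into the agent
    # or per host, in ~/.ssh/config:  PKCS11Provider /usr/lib/opensc-pkcs11.so

    # Git signing via S/MIME instead of OpenPGP
    git config --global gpg.format x509
    git config --global gpg.x509.program smimesign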
12/27/2025 at 9:36:48 PM
Now can you give us a list of all the features of PGP and a tool that does one specific thing really well?
by johnisgood
12/27/2025 at 9:49:26 PM
https://soatok.blog/2024/11/15/what-to-use-instead-of-pgp/
I wrote this to answer this exact question last year.
by some_furry
12/28/2025 at 12:08:06 PM
> The only downside to Sigstore is it hasn’t been widely adopted yet.
Which, from where I stand, means that PGP is the only viable solution because I don't have a choice. I can't replace PGP with Sigstore when publishing to Maven. It's nice to tell me I'm dumb because I use PGP, but really it's not my choice.
> Use SSH Signatures, not PGP signatures.
Here I guess it's just me being dumb on my own. Using SSH signatures with my Yubikeys (FIDO2) is very inconvenient. Using PGP signatures with my Yubikeys literally just works.
> Encrypted Email: Don’t encrypt email.
I like this one, I keep seeing it. Sounds like Apple's developer support: if I need to do something and ask for help, the answer is often: "Don't do it. We suggest you only use the stuff that just works and be happy about it".
Sometimes I have to use emails, and cryptographers say "in that case just send everything in plaintext because eventually some of your emails will be sent in plaintext anyway". Isn't it like saying "no need to use Signal, eventually the phone of one of your contacts will be compromised anyway"?
by palata
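On the SSH-signatures point: much of the friction is on the verification side, which needs an allowed-signers file. A minimal sketch (Git 2.34+; the email is a placeholder):

    # map identities to keys so git can verify SSH signatures
    echo "palata@example.com $(cat ~/.ssh/id_ed25519.pub)" >> ~/.ssh/allowed_signers
    git config --global gpg.ssh.allowedSignersFile ~/.ssh/allowed_signers
    git log --show-signature -1    # signatures now verify instead of erroring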
12/28/2025 at 8:51:50 PM
The fact that every email encryption integration exports secure-context messages into insecure contexts when decrypting (which is how encrypted messages end up cited in plaintext) means email can't be secured.
This is true both for GPG and S/MIME.
Email encryption compromises itself in a way Signal doesn't.
by Natanael_L
12/28/2025 at 5:43:27 PM
> Which, from where I stand, means that PGP is the only viable solution because I don't have a choice.
You don't have a choice today. You could have a choice tomorrow if enough people demanded it.
Don't let PGP's convenience (in this context) pacify you from making a better world possible.
by some_furry
12/28/2025 at 11:05:57 PM
I agree with that. But I feel like I have been reading for years that there is really no reason to use PGP, and I have tried for years to use alternatives, but the fact remains that I still need to use PGP, either because it is mandatory or because in some cases the alternatives are not practical.
To me, there will be no reason to use PGP the day I find practical alternatives for the remaining use-cases I have. And I feel like signing git commits is not a weird use-case...
by palata
12/28/2025 at 11:30:21 PM
Does the GnuPG project sign its git commits with PGP?
by tptacek
12/30/2025 at 5:30:39 PM
Of course it does, and all released software and tarballs as well.
by KooBaa
12/30/2025 at 5:25:50 PM
Not sure what you are trying to say.
by palata
12/28/2025 at 7:34:04 AM
Off-topic question: as a recent dabbling reader of introductory pop-sci content in cryptography, I've been wondering how expert roles in the field are segmented.
E.g., in Filippo's blog post about age he clarified that he's not a cryptographer but rather a cryptography engineer. Is that also what your role is? What are the concrete divisions of labor, and what other related but separate positions exist in the overall landscape?
Where is the cutoff point of "don't roll your own crypto" across the different levels of expertise?
by xeonmc
12/28/2025 at 9:05:55 PM
There's no clear segmentation. There are symmetric and asymmetric primitives (and stuff that doesn't fit into these, like ZKPs), algorithms, protocols, research in many different types of attacks against each of these, research in design and defenses, and plenty of people will cover completely different subsets.
"Don't roll your own" covers everything from "don't design your own primitive" to "don't make your own encryption algorithm/mode" to "don't make your own encryption protocol" to "don't reimplement an existing version of any of the above; just use an encryption library".
(and it's mostly "don't deploy your own", if you want to experiment that's fine)
by Natanael_L
12/28/2025 at 10:28:46 PM
I wonder if there is a concrete point at which it turns into "this is common-sense security that even you should know about", like not conflating hashing and encryption, or "you should just have someone else do security for you"? I guess at larger entities you have a CISO role, but what about in smaller, scrappy endeavours? How does one know where one is at the limit of their due common sense and hand it off?
by xeonmc
12/28/2025 at 10:32:25 PM
Most practitioners in security --- from information security to compliance to systems security to software security to red-teaming --- have very little competence with cryptography. Cryptography is hyperspecialized. It is not part of the toolkit of any ordinary professional.
(That's nothing to do with how hard cryptography is, just with how little demand there is for serious cryptography engineering, especially compared with the population of people who have done serious academic study of it.)
by tptacek
12/28/2025 at 5:56:01 PM
There isn't one, but the modal professional cryptography engineer probably has a graduate degree in cryptography.
by tptacek
12/28/2025 at 5:46:02 PM
My job title is in the Security Engineer family.
I do not have a Ph.D. in Cryptography (not even an honorary one), so I do not call myself a Cryptographer. (Though I sometimes use "Cryptografur" in informal contexts for the sake of the pun.)
by some_furry
12/30/2025 at 4:30:11 AM
What you actually want doing crypto is a security engineer, not a cryptographer. To quote Shamir's Law, "cryptography is bypassed, not attacked". No-one ever attacks the crypto, they attack the way it's used, so you need an experienced cryptoplumber to set it up correctly, not a cryptographer who will design a mathematically elegant whatsit and announce "there, solved!".
Ideally, this person will also design the system that uses the crypto, because no matter how skilled the people on a standards committee might be, their product will always be, at best, a baroque nightmare with near-infinite attack surface, at worst an unusable pile of crap. IPsec vs. Wireguard is a prime example, but there are many others.
by pseudohadamard
12/28/2025 at 10:23:00 PM
Interesting. In a general sense, where does it fall on the xkcd#435 scale?
by xeonmc
12/28/2025 at 8:02:30 AM
You did not ask me, but you should do your due diligence because there are way too many armchair cryptographers around here.
by johnisgood
12/28/2025 at 10:13:39 AM
This is exactly that, in more detail than you could possibly ever ask for:
by miki123211
12/27/2025 at 9:47:00 PM
https://www.latacora.com/blog/2019/07/16/the-pgp-problem/#th...
by akerl_
12/27/2025 at 11:30:38 PM
> Use Signal. Or Wire, or WhatsApp, or some other Signal-protocol-based secure messenger.
That's a "great" idea considering the recent legal developments in the EU, which OpenPGP, as bad as it is, doesn't suffer from. It would be great if the author updated his advice into something more future-proof.
by jhgb
12/27/2025 at 11:52:38 PM
There's no future-proof suggestion that's immune to the government declaring it a crime.
If you want a suggestion for secure messaging, it's Signal/WhatsApp. If you want to LARP at security with a handful of other folks, GPG is a fine way to do that.
by akerl_
12/28/2025 at 5:02:11 PM
> If you want a suggestion for secure messaging, it's Signal/WhatsApp. If you want to LARP at security with a handful of other folks, GPG is a fine way to do that.
I want secure messaging, not encrypted SMS. I want my messages to sync properly between an arbitrary number of devices. I want my messaging history to not be lost when I lose a device. I want not losing my messaging history to not be a paid feature. I want to not depend on a shady crypto company to send a message.
by goldsteinq
12/28/2025 at 5:51:26 PM
I seriously don't care what messenger you use, as long as it isn't email, which can't be made secure. Pick something open source. It'll be less secure than Signal, but way more secure than email.
by tptacek
12/28/2025 at 8:23:20 PM
Then your next best bet is Matrix.org. Not to the same security standard as Signal, but if you don't have a specific threat against you then it's fine.
by Natanael_L
12/28/2025 at 8:35:35 PM
Pros of Matrix: it actually has a consistent history (in theory); no vendor lock-in. Cons of Matrix: encryption breaks constantly. Right now I’m stuck in a fun loop of endlessly changing recovery keys: https://github.com/element-hq/element-web/issues/31392
by goldsteinq
12/28/2025 at 10:15:41 PM
bleurgh. that issue is very actively under investigation (modulo xmas). please can you submit debug logs from Element Web referencing that issue.
by Arathorn
12/29/2025 at 12:19:07 AM
I’m facing it on Element Desktop, but I’ll try to reproduce it on Element Web. I’ve tried to submit logs from Element Desktop, but it says that `/rageshake` (which I was told to do) is not a command. I’m happy to help with debugging this, but I’m not sure how to submit logs from Desktop.
Something like this happens basically every time I try to use Matrix though. Messages are not decrypting, or not being delivered, or devices can’t be authenticated for some cryptic reason. The reason I even tried to use Element Desktop is because my nheko is seemingly now incapable of sending direct messages (the recipient just gets an infinite “waiting for message”).
by goldsteinq
12/29/2025 at 1:07:13 AM
Weird. Encryption these days (in Element Web/Desktop and Element X at least) should be pretty robust - although this whole identity reset thing is a known bug on Element Web/Desktop. You can submit debug logs from Settings: Help & About: Submit Debug Logs, and hopefully that might give a hint on what's going wrong.
by Arathorn
12/29/2025 at 10:04:46 AM
No “Submit Debug Logs” there, as far as I can see. Do I need to be on the matrix.org homeserver for this to work or something?
https://photos.goldstein.lol/share/OIgowBN4Wmi4zlm8DmDP0s8jH...
by goldsteinq
12/29/2025 at 8:57:27 PM
looks like whoever’s run that Element has disabled debug log reporting. not sure i can do much to help here :/
by Arathorn
12/28/2025 at 5:33:27 PM
> I want secure messaging, not encrypted SMS.
I send long messages via Signal, typed on a desktop computer, all the time. (In fact, I almost exclusively use Signal through my desktop app.)
You don't have to use it like "encrypted SMS"! You're free.
> I want my messages to sync properly between arbitrary number of devices. I want my messaging history to not be lost when I lose a device.
OK. https://signal.org/blog/a-synchronized-start-for-linked-devi...
> I want not losing my messaging history to not be a paid feature.
I genuinely don't understand what you mean here. From https://signal.org/blog/introducing-secure-backups/
"If you do decide to opt in to secure backups, you’ll be able to securely back up all of your text messages and the last 45 days’ worth of media for free."
If you have a metric fuckton of messages, that does cost money, sure, but as they say:
"If you want to back up your media history beyond 45 days, as well as your message history, we also offer a paid subscription plan for US$1.99 per month."
"This is the first time we’ve offered a paid feature. The reason we’re doing this is simple: media requires a lot of storage, and storing and transferring large amounts of data is expensive. As a nonprofit that refuses to collect or sell your data, Signal needs to cover those costs differently than other tech organizations that offer similar products but support themselves by selling ads and monetizing data."
If you want Signal to host the encrypted storage, that costs money. If you don't want to pay Signal money, they provide 45 days of backup for free.
If you want to self-host your own backups (at your own cost), that's easy to do.
You can literally set up SyncThing to stream your on-device backups to your NAS, cloud storage, or whatever.
> I want to not depend on a shady crypto company to send a message.
Shady crypto company?
Are you referring to MobileCoin? That feature isn't in the pipeline for sending messages.
I checked! https://soatok.blog/2025/02/18/reviewing-the-cryptography-us...
by some_furry
12/28/2025 at 7:51:37 PM
> You don't have to use it like "encrypted SMS"! You're free.
Using it as something more than encrypted SMS requires persistent message history between devices.
> metric fuckton of messages
“More than 45 days” is a metric fuckton? Seriously?
> If you want Signal to host the encrypted storage, that costs money. If you don't want to pay Signal money, they provide 45 days of backup for free.
I don’t want Signal to store my messages. I want Signal to not lock in my messages on their servers, so I can sync them between my devices and back them up into my own backups.
> If you want to self-host your own backups (at your own cost), that's easy to do.
Except there’s no way to move it between platforms. I have more than one device.
> Are you referring to MobileCoin? That feature isn't in the pipeline for sending messages.
I don’t want a shady crypto company to hold my data hostage, and there’s no way to store it on my hardware and then move it between platforms. That’s my problem with Signal.
> A Synchronized Start for Linked Devices
It only properly transfers 45 days. You can’t have more than one phone. Phones are special “primary devices”, and AFAIK you can’t restore your messages if you lose your phone, even if you have a logged-in Signal Desktop.
by goldsteinq
12/28/2025 at 8:19:32 PM
I literally included a screenshot that shows you can set up backups in a directory on your device and then use your own backup solution.
Signal is not holding you hostage.
by some_furry
12/28/2025 at 8:28:50 PM
Yes, if your only device is a single Android phone you can do that. You can’t, however, use that backup to populate your message history on other platforms.
I’ve already lost message history consistency because one of my devices was offline for too long. The messages are there on my other device, but Signal refuses to let me copy my data from one of my devices to another. Signal is, quite literally, worse at syncing message history than IRC — at least with IRC I can set up a bouncer and have a consistent view of history on all of my devices, but there are no Signal bouncers.
by goldsteinq
12/28/2025 at 8:44:38 PM
Look, if defending "message history consistency" is a reason you're choosing some other secure messenger rather than Signal, then I don't think this argument is very productive; use some other secure messenger then. But if "message history consistency" is a reason you're endorsing encrypted email over Signal, you're committing malpractice.
The point is that whatever secure messenger you use, it must plausibly be secure. Email cannot plausibly be made secure. Whatever other benefits you might get from using it --- federation, open source, UX improvements, universality --- come at the cost of grave security flaws.
Most people who use encrypted email are doing so in part because it does not matter if any of their messages are decrypted. They simply aren't interesting or valuable. But in endorsing a secure messenger of any sort, you're influencing the decisions of people whose messages are extremely sensitive, even life-or-death sensitive. For those people, federation or cross-platform support can't trump security, and as practitioners we are obligated to be clear about that.
by tptacek
12/28/2025 at 8:56:50 PM
I’m definitely not “committing malpractice” on account of not being a security practitioner. I’m talking from the perspective of a user.
It’s important to me — as a user — that a communication tool doesn’t lose my data, and Signal already did. Actual practitioners keep recommending Signal and sure, I believe that in a weird scenario where my encryption keys are somehow compromised without also compromising my local message history, Signal’s double ratchet will do wonders — but it doesn’t actually work as a serious communication tool.
It’s also kinda curious that while the “email cannot be made secure” mantra is constantly repeated online, basically every organization that needs secure communication uses email. Openwall are certainly practitioners, and they use PGP-over-email: are they committing malpractice?
by goldsteinq
12/28/2025 at 9:58:28 PM
Very few organizations need security from state-level or similar threats, or from the infrastructure provider. Most organizations that want secure email don't use any kind of e2ee at all; they just trust Google or Microsoft or whomever.
The few jobs that actually care about this stuff, like journalists, do use Signal.
by joshuamorton
12/28/2025 at 9:14:47 PM
> but it doesn’t actually work as a serious communication tool.
Say more. Plenty of people use Signal as a serious communication tool.
> Openwall are certainly practicioners, and they use PGP-over-email: are they commiting malpractice?
They, and other communities that use GPG-encrypted emails, are LARPing, and it’s only fine because their emails don’t actually matter enough for anybody to care about compromising them.
It’s not malpractice to LARP: plenty of people love getting out their physical or digital toys and playing pretend. But if you’re telling other people that your foam shield can protect them from real threats, you are lying.
by akerl_
12/28/2025 at 9:22:46 PM
> Say more. Plenty of people use Signal as a serious communication tool.
I did say more already. Maybe you believe in serious communication tools that can’t synchronize searchable history between devices, but I don’t.
> They, and other communities that use GPG-encrypted emails are LARPing, and it’s only fine because their emails don’t actually matter enough for anybody to care about compromising them.
Are we talking about the same Openwall? Are you aware what Openwall’s oss-security mailing list is? Please, do elaborate how nobody cares about getting access to an unlimited stream of zerodays for basically every Unix-like system.
by goldsteinq
12/28/2025 at 10:30:09 PM
At this point you're just repeating the argument you made upthread without responding to any of its rebuttals. That's fine; I too am comfortable with the arguments on this thread as they stand. Let's save each other some time and call it here.
by tptacek
12/28/2025 at 9:28:03 PM
I’m very familiar with oss-security, a public mailing list that doesn’t really have anything to do with GPG-encrypted emails. Encrypting emails to a public mailing list, with GPG or otherwise, wouldn’t really make sense.
by akerl_
12/28/2025 at 9:42:36 PM
Okay, sorry, not the oss-security mailing list, the oss-security _distros_ mailing list.
https://oss-security.openwall.org/wiki/mailing-lists/distros
> Only use these lists to report security issues that are not yet public
> To report a non-public medium or high severity 2) security issue to one of these lists, send e-mail to distros [at] vs [dot] openwall [dot] org or linux [dash] distros [at] vs [dot] openwall [dot] org (choose one of these lists depending on who you want to inform), preferably PGP-encrypted to the key below.
by goldsteinq
12/28/2025 at 9:50:40 PM
Yes, that would be an example of LARPing security. The obvious indicator is that encrypting your message is entirely optional, per their own instructions. The less obvious bit is that even if you encrypt your message, anyone without GPG configured who replies has stripped any attempt at encryption from the contents.
by akerl_
12/28/2025 at 9:01:39 PM
Yes.
by tptacek
12/28/2025 at 12:32:03 AM
Nobody decided that it's a crime, and it's unlikely to happen. The question is, what do you do with mandatory snooping of centralized proprietary services that renders them functionally useless, aside from "just live with it"? I was hoping for actual advice rather than a snarky non-response, yet here we are.
by jhgb
12/28/2025 at 12:58:14 AM
> Nobody decided that it's a crime, and it's unlikely to happen.
Which jurisdiction are you on about? [1] Pick your poison.
For example, the UK has a law forcing suspects to cooperate. This law has been used to convict suspects who weren't cooperating.
NL does not, but police can use force to have a suspect unlock a device with a finger or face.
by Fnoord
12/28/2025 at 10:48:20 AM
You're asking for a technical solution to a political problem.
The answer is not to live with it, but to become politically active to support your principles. No software can save you from an authoritarian government - you can let that fantasy die.
by closewith
12/28/2025 at 12:37:47 AM
I gave you the answer that exists: I'm not aware of any existing or likely-to-exist secure messaging solution that would be a viable recommendation.
The available open-source options come nowhere close to the messaging security that Signal/Whatsapp provide. So you're left with either "find a way to access Signal after they pull out of whatever region has criminalized them operating with a backdoor on comms" or "pick any option that doesn't actually have strong messaging security".
by akerl_
12/28/2025 at 2:05:05 AM
> messaging security
Eh?
There are alternatives, try Ricochet (Refresh) or Cwtch.
by johnisgood
12/28/2025 at 2:51:22 AM
I stand by what I said.
by akerl_
12/28/2025 at 7:58:54 AM
I mean... why?
by johnisgood
12/28/2025 at 10:45:25 AM
Not the GP, but most of us want to communicate with other people, which means SMS or WhatsApp. No point having perfect one-time-pad encryption and no one to share pads with.
by closewith
12/28/2025 at 4:29:38 PM
Most countries will throw you in jail for years if you refuse to give up the password to encrypted devices they want. [1]
And that's even if you are innocent on the underlying charge or search.
Encryption, in this political climate, is a pick-your-poison situation.
- Either you go to jail for years, but you know your gov and other actors have no access to your data,
- or you store on remote/proprietary apps, stay free, but your gov or other actors may or may not have access to it.
by signed-log
12/28/2025 at 2:27:06 AM
Could you please link the source code for the WhatsApp client, so that we can see the cryptographic keys aren't being stored and later uploaded to Meta's servers, completely defeating the entire point of Signal's E2EE implementation and ratchet protocol?
by anonym29
12/28/2025 at 2:57:04 AM
This may shock you, but plenty of cutting-edge application security analysis doesn't start with source code.
There are many reasons, but one of them is that for the overwhelming majority of humans on the planet, their apps aren't being compiled from source on their device. So since you have to account for the fact that the app in the App Store may not be what's in some git repo, you may as well just start with the compiled/distributed app.
by akerl_
12/28/2025 at 3:22:12 AM
Whether or not other people build from source code has zero relevance to a discussion about the trustworthiness of security promises coming from former PRISM data providers about the closed-source software they distribute. Source availability isn't theater, even when most people never read it, let alone build from it. The existence of surreptitious backdoors and dynamic analysis isn't a knock against source availability.
Signal and WhatsApp do not belong in the same sentence together. One's open source software developed and distributed by a nonprofit foundation with a lengthy history of preserving and advancing accessible, trustworthy, verifiable encrypted calling and messaging going back to TextSecure and RedPhone; the other's a piece of proprietary software developed and distributed by a for-profit corporation whose entire business model is bulk harvesting of user data, with a lengthy history of misleading and manipulating their own users and distributing user data (including message contents) to shady data brokers and intelligence agencies.
To imply these two offer even a semblance of equivalent privacy expectations is misguided, to put it generously.
by anonym29
12/28/2025 at 7:31:12 PM
These are words, but I don't understand how they respond to the preceding comment, which observes that binary legibility is an operational requirement for real security given that almost nobody uses reproducible builds. In reality, people meaningfully depend on work done at the binary level to ensure lack of backdoors, not on work done at the source level.
The preceding comment is saying that source security is insufficient, not that transparency is irrelevant.
by tptacek
12/28/2025 at 9:19:10 PM
Source availability is what makes a chain of trust possible that simply isn't meaningfully possible with closed source software, even with dynamic analysis, decompilation, reverse engineering, runtime network analysis with TLS decryption, etc.
Both you and the preceding commenter are correct that just running binaries signed and distributed by Alphabet (Google) and/or Apple presents room for additional risks beyond those observable in the source code, but the solution to this problem isn't to say "and therefore source availability doesn't matter at all for anyone", it's to choose to build from source or to obtain and install APKs built and signed by the developers, such as via Accrescent or Obtanium (pulls directly from github, gitlab, etc releases).
There's a known-good path. Most people do not take the known-good path. Their choice to do so does not invalidate or eliminate the desirable properties of known-good path (verifiability, trustworthiness).
I genuinely do not understand the argument you and the other user are making. It reads to me like an argument that goes "Yes, there's a known, accurate, and publicly documented recipe to produce a cure for cancer, but it requires prerequisite knowledge to understand that most people lack, and it's burdensome to follow the recipe, so most people just buy their vials from the untrustworthy CancerCureCorporation, who has the ability to give customers a modified formula that keeps them sick rather than giving them the actual cure, and almost nobody makes the cure themselves without going through this untrustworthy but ultimately optional intermediary, so the public documentation of the cure doesn't matter at all, and there's no discernible difference between having the cure recipe and not having the cure recipe."
by anonym29
12/28/2025 at 10:20:51 PM
No, you're completely off the rails from the first sentence. It is absolutely possible --- in some ways more possible[†] --- to make a chain of trust without source availability. Your premise is that "reverse engineering" is somehow incomplete or lossy with respect to uncovering software behavior, and that simply isn't true.
[†] Source is always good to have, but it's insufficient.
by tptacek
12/28/2025 at 10:57:12 PM
Never once anywhere in this thread have I claimed that source code alone is sufficient by itself to establish a chain of trust, merely that it is a necessary prerequisite to establish a chain of trust.
That said, you seem to be refuting even that idea. While your reputation precedes you, and while I haven't been in the field quite as long as you, I do have a few dozen CVEs, I've written surreptitious side-channel backdoors and broken production cryptographic schemes in closed-source software doing binary analysis as part of a red team alongside former NCC folks. I don't know a single one of them who would say that lacking access to source code increases your ability to establish a chain of trust.
Can you please explain how lacking access to source code, being ONLY able to perform dynamic analysis, rather than dynamic analysis AND source code analysis, can ever possibly lead to an increase in the maximum possible confidence in the behavior of a given binary? That sounds like a completely absurd claim to me.
by anonym29
12/28/2025 at 10:59:09 PM
I see what's happening. You're working under the misapprehension that static analysis is only possible with source code. That's not true. In fact: a great deal of real-world vulnerability research is performed statically in a binary setting.
There's a lot of background material I'd have to bring in to attempt to bring you up to speed here, but my favorite simple citation here is just: Google [binary lifter].
by tptacek
12/28/2025 at 11:07:54 PM
This assumption about me is not accurate at all, I've done static analysis professionally on CIL, on compiled bytecode, and on source code. Instead of being condescending and patronizing to someone you don't know that you've made factually inaccurate assumptions about, can you please explain how having just a binary and no access to source code gives you more information about, greater confidence in, and a stronger basis for trust in the behavior of a binary than having access to the binary AND the source code used to build it?
by anonym29
12/28/2025 at 11:19:35 PM
I have no idea who you are and can only work from what you write here, and with this comment, what you've written no longer makes sense. The binary (or the lifted IR form of the binary or the control flow graph of the binary or whatever form you're evaluating) is the source of truth about what a program actually does, not the source code.
The source code is just a set of hints about what the binary does. You don't need the hints to discern what a binary is doing.
by tptacek
12/29/2025 at 12:22:36 AM
I'm not refuting that the binary is the source of truth about behavior, I never stated it wasn't, and I don't know where you even got the idea that I wasn't. It's been very frustrating to have to repeatedly do this - you and akerl_ have both been attacking strawman positions I do not hold and never stated, and being condescending and patronizing in the process. Is it possible you're making assumptions about me based on arguments made by other people that sound similar to the ones I'm making? I'd really appreciate not having to keep reminding you that I've never made the claims you're implying I'm making, if that's not too much to ask of you.
At a high level, what I'm fundamentally contending is that WhatsApp is less trustworthy and secure than Signal. I can have a higher degree of confidence in the behavior and trustworthiness of the Signal APK I build from source myself than I can from WhatsApp, which I can't even build a binary of myself. I'd simply be given a copy of it from Google Play or Apple's App Store.
Signal's source code exhibits known trustworthy behavior, i.e. not logging both long-term and ephemeral cryptographic keys and shipping them off to someone else's servers. Sure, Google Play and Apple can modify this source code, add a backdoor, and the binary distributed by Google Play and Apple can have behavior that doesn't match the behavior of the published source code. You can detect this fairly easily, because you have a point of reference to compare to. You know what the compiled bytecode from the source code you've reviewed looks like, because you can build it yourself, no trust required[1], it's not difficult to see when that differs in another build.
With WhatsApp, you don't even have a point of reference of known good behavior, i.e. not logging both long-term and ephemeral cryptographic keys and shipping them off to someone else's server, in the first place. You can monitor all the disk writes, you can monitor all the network activity. Just because YOU don't observe cryptographic keys being logged, either in-memory, or on disk, or being sent off to some other server, doesn't mean there isn't code present to perform those exact functions under conditions you've never met and never would - it's entirely technically feasible for Google and Apple to be fingerprinting a laundry list of identifiers of known security researchers and be shipping them binaries with behavior that differs from the behavior of ordinary users, or even for them to ship targeted backdoored binaries to specific users at the demand of various intelligence agencies.
The upper limit for the trustworthiness of a Signal APK you build from source yourself is on a completely different planet from the trustworthiness of a WhatsApp APK you only have the option of receiving from Google.
And again, none of this even begins to factor in Meta's extensive track record on deliberately misleading users on privacy and security through deceptive marketing and subverting users' privacy extensively. Onavo wasn't just capturing all traffic, it was literally doing MITM attacks against other companies' analytics servers with forged TLS certificates. Meta was criminally investigated for this and during discovery, it came out that executives understood what was going on, understood how wrong it was, and deliberately continued with the practice anyway. Actual technical analysis of the binaries and source code aside, it's plainly ridiculous to suggest that software made by that same corporation is as trustworthy as Signal. One of these apps is a messenger made by a company with a history of explicitly misleading users with deceptive privacy claims and employing non-trivial technical attacks against their own users to violate their own users' privacy, the other is made by a nonprofit with a track record of being arguably one of the single largest contributors to robust, accessible, audited, verifiable secure cryptography in the history of the field. I contend that suggesting these two applications are equally secure is irrational, impossible to demonstrate or verify, and indefensible.
[1] Except in your compiler, linker, etc... Ken Thompson's 'Reflections on Trusting Trust' still applies here. The argument isn't that source code availability automatically means 100% trustworthy, it means the upper boundary for trustworthiness is higher than without source availability.
by anonym29
12/29/2025 at 12:59:51 AM
It's clear we're not going to agree on the technical discussion, but I do want to reply to the claim that I've been strawmanning you.
I've been largely ignoring your sideline commentary about not trusting Meta and their other work outside of WhatsApp. Mostly because the whole thrust of my argument is that an app's security is confirmed by analyzing what the code does, not by listening to claims from the author.
Beyond that, I've been commenting in good faith about the core thrust of our disagreement, which is whether or not a lack of available source code disqualifies WhatsApp as a viable secure messaging option alongside Signal.
As part of that, I had to respond midway through because you put a statement in quotation marks that was not actually something I'd said.
by akerl_
12/29/2025 at 2:55:20 AM
Sorry, no, I'm not going to pick this apart. You wrote:
> Can you please explain how lacking access to source code, being ONLY able to perform dynamic analysis, rather than dynamic analysis AND source code analysis, can ever possibly lead to an increase in the maximum possible confidence in the behavior of a given binary?
This doesn't make sense, because not having source code doesn't limit you to dynamic analysis. I assumed, 2 comments back, you were just misunderstanding SOTA reversing; you got mad at me about that. But the thing you "never stated it wasn't" is right there in the comment history. Acknowledge that and help me understand where the gap was, or this isn't worth all the words you're spending on it.
by tptacek
12/28/2025 at 9:36:57 PM
> but the solution to this problem isn't to say "and therefore source availability doesn't matter at all for anyone"
Thankfully, I didn’t say that.
by akerl_
12/28/2025 at 9:44:03 PM
Great, then it sounds like we agree: your original equivalence of Signal and WhatsApp was misguided, since one offers a verifiable chain of trust that starts with source availability and the other doesn't, to say nothing of the lengthy history of untrustworthiness and extensive, deliberate privacy violations of the company that owns and maintains WhatsApp, right?
by anonym29
12/28/2025 at 9:55:43 PM
No, we don’t agree. There are things that source code is good for, but validating the presence or absence of illicit data-stealing code in apps delivered to consumers is not one of those things. For that, source code can show you obvious malfeasance, but since it’s not enough to rule out obvious malfeasance, you’re stuck going to analysis of the compiled app in both cases.
The population of users who have a verifiable path from an open source repo to an app on their device is a rounding error in the set of humans using messaging apps.
by akerl_
12/28/2025 at 10:16:11 PM
I think we've both made our positions clear. From my perspective, you're continuing to heavily cite user statistics that are irrelevant to the properties of verifiability or trustworthiness of the applications themselves, the goalposts I am discussing keep being moved, and there is a repeated pattern of neglect to address the points I'm raising. Readers can judge for themselves. Curious readers should also read about the history of Meta's Onavo VPN software and resulting lawsuits and settlements in evaluating the credibility of Meta's privacy marketing.
by anonym29
12/28/2025 at 10:34:03 PM
Just to be crystal clear about the goalposts: I said at the start of this chain that if somebody wants secure messaging, they should use Signal or WhatsApp.
You raised concerns about lack of source availability, and I’ve been consistent in my replies that source availability is not the way that somebody who wants secure messaging is going to know they’re getting it. They’re going to get it because they’re using a popular platform with robust primitives, whose compiled/distributed apps receive constant scrutiny from security researchers.
Signal and WhatsApp are that. Concerns about Meta’s other work are just noise, in part because analysis of the WhatsApp distributed binaries doesn’t rely on promises from Meta.
by akerl_
12/27/2025 at 9:48:03 PM
Saw it, not impressed. GnuPG has a lot more features than signing and file encryption.
And there are lots of tools for file encryption anyway. I have a bash function using openssh, sometimes I use croc (also uses PAKE), etc.
I need an alternative to "gpg --encrypt --armor --recipient <foo>". :)
by johnisgood
12/27/2025 at 9:51:04 PM
I guess we'll have to live with you being unimpressed.
by akerl_
12/27/2025 at 9:55:47 PM
> I need an alternative to "gpg --encrypt --armor --recipient <foo>"
That's literally age.
by some_furry
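For concreteness, the age equivalent of that invocation (the recipient string below is a placeholder; real ones come from age-keygen):

    age-keygen -o key.txt                       # prints an age1... public key
    age -a -r age1exampleplaceholder... -o msg.txt.age msg.txt   # armor + encrypt
    age -d -i key.txt msg.txt.age               # decrypt with the identity file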
12/27/2025 at 10:02:47 PM
No, because there is no keyring and you have to supply people's public key each time. It is not suitable for large-scale public key management (with unknown recipients), and it does not support automatic discovery or trust management. Age does NOT SUPPORT signing at all either.
by johnisgood
12/27/2025 at 10:05:43 PM
Why is a keyring important to you?
Would "fetch a short-lived age public key" serve your use case? If so, then an age plugin that builds atop the AuxData feature in my Fediverse Public Key Directory spec might be a solution. https://github.com/fedi-e2ee/public-key-directory-specificat...
But either way, you shouldn't have long-lived public keys used for confidentiality. It's a bad design to do that.
by some_furry
12/27/2025 at 10:10:45 PM
> you shouldn't have long-lived public keys used for confidentiality.
This statement is generic and misleading. Using long-lived keys for confidentiality is bad in real-time messaging, but for non-ephemeral use cases (file encryption, backups, archives) it is completely fine AND desired.
> Would "fetch a short-lived age public key" serve your use case?
Sadly no.
by johnisgood
12/27/2025 at 10:24:44 PM
(This is some_furry, I'm currently rate-limited. I thought this warranted a reply, so I switched to this account to break past the limit for a single comment.)
> This statement is generic and misleading.
It may be generic, but it's not misleading.
> Using long-lived keys for confidentiality is bad in real-time messaging, but for non-ephemeral use cases (file encryption, backups, archives) it is completely fine.
What exactly do you mean by "long-lived"?
The "lifetime" of a key being years (for a long-lived backup) is less important than how many encryptions are performed with said key.
The thing you don't want is to encrypt 2^50 messages under the same key. Even if it's cryptographically safe to do that, any post-compromise key rotation will be a fucking nightmare.
The primary reason to use short-lived public keys is to limit the blast radius. Consider these two companies:
Alice Corp. uses the same public key for 30+ years.
Bob Ltd. uses a new public key for each quarter over the same time period.
Both parties might retain the secret key indefinitely, so that if Bob Ltd. needs to retrieve a backup from 22 years ago, they still can.
Now consider what happens if both of them lose their currently-in-use secret key due to a Heartbleed-style attack. Alice has 30 years of disaster recovery to contend with, while Bob only has up to 90 days.
Additionally, file encryption, backups, and archives typically use ephemeral symmetric keys at the bottom of the protocol. Even when a password-based key derivation function is used (and passwords are, for whatever reason, reused), the password hashing function usually has a random salt, thereby guaranteeing uniqueness.
The idea that "backups" magically mean "long-lived" keys are on the table, without nuance, is extremely misleading.
> > Would "fetch a short-lived age public key" serve your use case?
> Sadly no.
*shrug* Then, ultimately, there is no way to securely satisfy your use case.
by soatok
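A sketch of the Bob Ltd. pattern with age (age-keygen is the real tool; the naming scheme and archival step are illustrative):

    # fresh keypair per quarter; new backups encrypt to the newest recipient
    age-keygen -o backup-2026Q1.key
    grep 'public key:' backup-2026Q1.key    # hand this recipient to backup jobs
    # move backup-2026Q1.key to offline storage; old keys kept for restores only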
12/27/2025 at 10:52:30 PM
You introduced "short-lived" vs "long-lived", not me. Long-lived as wall-clock time (months, years) is the default interpretation in this context.
The Alice / Bob comparison is asymmetric in a misleading way. You state Bob Ltd retains all private keys indefinitely. A Heartbleed-style attack on their key storage infrastructure still compromises 30 years of backups, not 90 days. Rotation only helps if only the current operational key is exposed, which is an optimistic threat model you did not specify.
Additionally, your symmetric key point actually supports what I said. If data is encrypted with ephemeral symmetric keys and the asymmetric key only wraps those, the long-lived asymmetric key's exposure does not enable bulk decryption without obtaining each wrapped key individually.
> "There is no way to securely satisfy your use case"
No need to be so dismissive. Personal backup encryption with a long-lived key, passphrase-protected private key, and offline storage is a legitimate threat model. Real-world systems validate this: SSH host keys, KMS master keys, and yes, even PGP, all use long-lived asymmetric keys for confidentiality in non-ephemeral contexts.
And to add to this, incidentally, age (the tool you mentioned) was designed with long-lived recipient keys as the expected use case. There is no built-in key rotation or expiry mechanism because the authors considered it unnecessary for file encryption. If long-lived keys for confidentiality were inherently problematic, age would be a flawed design (so you might want to take it up with them, too).
In any case, yeah, your point about high-fan-out keys with a large blast radius is correct. That is different from "long-lived keys are bad for confidentiality" (see above with regard to "age").
by johnisgood
12/27/2025 at 11:18:56 PM
An intended use case for FOKS (https://foks.pub) is to allow long-lived durable shared secrets between users and teams with key rotation when needed.
by maxtaco
12/28/2025 at 6:09:21 PM
> The Alice / Bob comparison is asymmetric in a misleading way. You state Bob Ltd retains all private keys indefinitely. A Heartbleed-style attack on their key storage infrastructure still compromises 30 years of backups, not 90 days.
No. Having 30 years of secret keys at all is not the same as having 30 years of secret keys in memory.
by some_furry
12/28/2025 at 4:29:45 AM
>Personal backup encryption with a long-lived key, passphrase-protected private key, and offline storage is a legitimate threat model...
If you're going to use a passphrase anyway, why not just use a symmetric cipher?
In fact, for file storage, why not use an encrypted disk volume so you don't need to use PGP?
by stackghost
12/28/2025 at 7:57:59 AM
That was just me being goofy in that bit (and only that), but I hope the rest of my message went across. :)
> In fact, for file storage, why not use an encrypted disk volume so you don't need to use PGP?
Different threat models. Disk encryption (LUKS, VeraCrypt, plain dm-crypt) protects against physical theft. Once mounted, everything is plaintext to any process with access. File-level encryption protects files at rest and in transit: backups to untrusted storage, sharing with specific recipients, storing on systems you do not fully control. You cannot send someone a LUKS volume to decrypt one file, and backups of a mounted encrypted volume are plaintext unless you add another layer.
by johnisgood
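A file-level sketch of that model (age as the wrapper; recipient, paths, and host are placeholders):

    # encrypt a backup for an untrusted store; only the recipient key decrypts
    tar cz ~/documents | age -r age1exampleplaceholder... -o docs.tar.gz.age
    scp docs.tar.gz.age user@untrusted-host:backups/   # host never sees plaintext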
12/28/2025 at 9:34:35 PM
>You cannot send someone a LUKS volume to decrypt one file, and backups of a mounted encrypted volume are plaintext unless you add another layer.
Veracrypt, and I'm sure others, allow you to do exactly this. You can create a disk image that lives in a file (like a .iso or .img) and mount/unmount it, share it, etc.
by stackghost
12/28/2025 at 9:38:48 PM
That’s not what they said. They’re saying you often want to give someone a specific file from a disk, rather than the whole set of files.
by akerl_
12/29/2025 at 4:47:12 AM
You can still do that with a .dmg, for example. I've done it; it works more or less like a zip.
But even if that was somehow unreasonable or undesired, you can use Filippo's age for that. PGP has no use case that isn't done better by some other tool, with the possible exception of "cosplay as a leet haxor".
by stackghost
12/28/2025 at 10:26:19 AM
We need a keyring at our company, because there's no other medium of communication where you can reach both management and technical people at other companies.
And we have massive issues due to the ongoing cry of "shut everything off", followed by no improvement and no alternative: we have to talk with people at other organizations (and every organization runs their own mailserver), and the only really common way of communicating is mail.
And when everyone has a GPG key, you get... what? A keyring.
You could say we do not need gpg because we control the mailserver, but what if a mailserver is compromised and the mails are still in mailboxes?
The public keys are not that public, only known to the parties involved. Still, it's an issue, and we have a keyring.
by deknos
12/28/2025 at 8:45:24 PM
You need a private PKI, not a keyring. They're subtly different - a PKI can handle key rotation, etc.
Yes, there aren't a lot of good options for that. If you're using something like a Microsoft software stack with Active Directory or similar identity/account management, then there's usually some PKI support in there to anchor to.
Across organisations, there are really very, very few good solutions. GPG specifically is much too insecure when you need to receive messages from untrusted senders. There's basically S/MIME, which has comparable security issues; then we have AD federation, or Matrix.org with a server per org.
> You could say, we do not need gpg, because we control the mailserver, but what if a mailserver is compromised and the mails are still in mailboxes?
How are you handling the keys? This is only true if users protect their own keypairs with strong passwords / a yubikey applet, etc.
by Natanael_L
12/28/2025 at 5:41:19 PM
> We need a keyring at a company.
Look closely at the UX I'm proposing in https://github.com/fedi-e2ee/pkd-client-php?tab=readme-ov-fi...
Tell me why this won't work for your company.
by some_furry
12/28/2025 at 2:48:51 AM
> you have to supply people's public key each time
Keyrings are awful. I want to supply people’s public keys each time. I have never, in my entire time using cryptography, wanted my tool to guess or infer what key to verify with. (Heck, JOSE has a long history of bugs because it infers the key type, which is also a mistake.)
I have an actual commercial use case that receives messages (which are, awkwardly, files sent over various FTP-like protocols, sigh), decrypts and verifies them, and further processes them. This is fully automated and runs as a service. For horrible legacy reasons, the files are in PGP format. I know the public key with which they are signed (provisioned out of band) and I have the private key for decryption (again, provisioned out of band).
This would be approximately two lines of code using any sane crypto library [0], but there really isn’t an amazing GnuPG alternative that’s compatible enough.
But GnuPG has keyrings, and it really wants to use them and to find them in some home directory. And it wants to identify keys by 32-bit truncated hashes. And it wants to use Web of Trust. And it wants to support a zillion awful formats from the nineties using wildly insecure C code. All of this is actively counterproductive. Even ignoring potential implementation bugs, I have far more code to deal with key rings than actual gpg invocation for useful crypto.
[0] I should really not have to even think about the interaction between decryption and verification. Authenticated decryption should be one operation, or possibly two. But if it’s two, it’s one operation to decapsulate a session key and a second operation to perform authenticated decryption using that key.
by amluto
12/28/2025 at 12:52:08 PM
Some years ago I wrote "just a little script" to handle encrypting password-store secrets for multiple recipients. It got quite ugly and much more verbose than planned; I ended up parsing gpg output in Python for sanity. I think I used a combination of --keyring <mykeyring> --no-default-keyring. I would never encourage anyone to do this again.

by mkesper
12/28/2025 at 12:21:17 PM
> And it wants to identify keys by 32-bit truncated hashes.

That's 64 bits these days.
> I should really not have to even think about the interaction between decryption and verification.

Messaging involves two verifications: one to ensure that you are sending the message to whom you think you are sending it, and one to ensure that you know whom you received a message from. That is an inherent problem. Yes, you can use a shared key for this, but then you end up doing both verifications manually.
by upofadown
12/28/2025 at 2:16:41 PM
>> And it wants to identify keys by 32-bit truncated hashes.

> That's 64 bits these days.
The fact that it’s short enough that I even need to think about whether it’s a problem is, frankly, pathetic.
> Messaging involves two verifications: one to ensure that you are sending the message to whom you think you are sending it, and one to ensure that you know whom you received a message from. That is an inherent problem. Yes, you can use a shared key for this, but then you end up doing both verifications manually.
I can’t quite tell what you mean.
One can build protocols that do encrypt-then-sign, encrypt-and-sign, sign-then-encrypt, or something clever that combines encryption and signing. Encrypt-then-sign has a nice security proof, the other two combinations are often somewhat catastrophically wrong, and using a high quality combination can have good performance and nice security proofs.
But all of the above should be the job of the designer of a protocol, not the user of the software. If my peer sends me a message, I should provision keys, and then I should pass those keys to my crypto library along with a message I received (and perhaps whatever session state is needed to detect replays), and my library should either (a) tell me that the message is invalid and not give me a guess as to its contents or (b) tell me it’s valid and give me the contents. I should not need to separately handle decryption and verification, and I should not even be able to do them separately even if I want to.
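Libsodium-style signing APIs already have this all-or-nothing shape; a sketch with PyNaCl (illustrative only, not a full protocol):

    from nacl.signing import SigningKey, VerifyKey
    from nacl.exceptions import BadSignatureError

    sk = SigningKey.generate()
    signed = sk.sign(b"some message")     # signature || message

    try:
        # verify() either returns the message bytes or raises; this
        # interface offers no way to read an unverified payload.
        msg = VerifyKey(bytes(sk.verify_key)).verify(signed)
    except BadSignatureError:
        msg = None                        # no guess at the contents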
by amluto
12/28/2025 at 4:37:44 PM
> The fact that it’s short enough that I even need to think about whether it’s a problem is, frankly, pathetic.

Please resist the temptation to personally attack others.
I think you mean that 64 bits of hash output could be trivially collided using, say, Pollard's rho method. But it turns out that simple collisions are not an issue for such hashes used as identities. The fact that PGP successfully used 32 bits (16 bits of effort for a collision) for so long is actually a great example of the principle.
>...encrypt-then-sign, encrypt-and-sign, sign-then-encrypt...
You mean encrypt-then-MAC here I think.
>...I should not even be able to do them separately even if I want to.
Alas, that is not possible. The problem is intrinsic to end-to-end encrypted messaging. Protocols like PGP combine them into a single key fingerprint so that the user does not have to deal with them separately. You still have to verify the fingerprint for the people you send to and the fingerprint for the people who send you messages.
by upofadown
12/28/2025 at 5:53:10 PM
They didn't personally attack you. They (correctly) attacked 64-bit identifiers.

by tptacek
12/28/2025 at 9:14:14 PM
They were attacking an entire community. Perhaps I should have complained about being deliberately provocative.

But to the point, how long should something like a key fingerprint be?
by upofadown
12/28/2025 at 10:42:13 PM
> How long should something like a key fingerprint be?

At least 128 bits for most threat models. 192+ is preferable for mine.
https://soatok.blog/2024/07/01/blowing-out-the-candles-on-th...
My threat model assumes you want an attacker advantage of less than 2^-64 after 2^64 keys exist to be fingerprinted in the first place, and your threat model includes collisions.
If I remember correctly, cloud providers assess multi-user security by assuming 2^40 users, each of which will have 2^50 keys throughout their service lifetime.
If you round down your assumption to 2^34 users with at most 100 public keys on average (for a total of 2^41 user-keys), you can get away with 2^-41 after 2^41 at about 123 bits, which for simplicity you can round up to the nearest byte and arrive at 128 bits.
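To make the arithmetic concrete, a quick sketch of the bound being used (advantage roughly 2^(2k - n) for 2^k fingerprinted keys and an n-bit fingerprint; the user/key counts are this thread's assumptions, not measured data):

    import math

    # Solve 2*k - n <= log2(target advantage) for n.
    def min_fingerprint_bits(log2_keys, log2_advantage):
        return 2 * log2_keys - log2_advantage

    k = 34 + math.log2(100)              # 2^34 users x ~100 keys each, ~2^40.6
    print(min_fingerprint_bits(k, -41))  # ~122.3, i.e. "about 123 bits"; round up to 128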
The other thing you want to keep in mind is, how large are the keys in scope? If you have 4096-bit RSA keys and your fingerprints are only 64 bits, then by the pigeonhole principle we expect there to be 2^4032 distinct public keys with a given fingerprint. The average distance between fingerprints will be random (but you can approximate it to be an order of magnitude near 2^32).
In all honesty, fingerprints are probably a poor mechanism.
by some_furry
12/29/2025 at 11:41:32 AM
> ...and your threat model includes collisions.

OK, to be clear, I am specifically contending that the threat model for a key fingerprint does not include collisions. My proof is empirical: no one has come up with an attack on 64-bit PGP key fingerprints.
Collisions mean that an attacker can generate two or more messaging identities with the same fingerprint. How would that help them in some way?
by upofadown
12/28/2025 at 11:22:46 PM
> attacker advantage of less than 2^-64

Why so high? Computers are fast and massively parallel these days. If a cryptosystem fully relies on fingerprints, a second preimage of someone’s fingerprint where the attacker knows the private key for the second preimage (or it’s a cleverly corrupt key pair) catastrophically breaks security for the victim. Let’s make this astronomically unlikely even in the multiple potential victim case.
And it’s not like 256 bit hashes are expensive.
(I’m not holding my breath on fully quantum attacks using Grover’s algorithm, at high throughput, against billions of users, so we can probably wait a while before 256 bits feels uncomfortably short.)
by amluto
12/29/2025 at 11:48:58 AM
> And it’s not like 256 bit hashes are expensive.

A key fingerprint is a usability feature. It has no other purpose; otherwise we would just use the public key. Key fingerprints have to be kept as short as possible, so the question is: how short can that be? I would argue that 256-bit key fingerprints are not really usable.

Signal messenger uses 100 bits for their key fingerprint. They combine two to make a 60-digit decimal number. Increasing that to 2 x 256 bits would mean ending up with 154 decimal digits. That would be completely unusable.
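The digit counts are just n bits times log10(2); a quick check (the exact encoding Signal uses differs in details, so treat these as approximations):

    import math

    digits = lambda bits: bits * math.log10(2)
    print(digits(2 * 100))   # ~60.2  -> the 60-digit safety number
    print(digits(2 * 256))   # ~154.1 -> the 154 digits mentioned above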
by upofadown
12/29/2025 at 12:43:28 AM
I was asked about the minimum value, and gave my explanation for why some values could be considered the minimum. By all means, use 256-bit fingerprints.

by some_furry
12/28/2025 at 10:18:55 PM
No, again, they were attacking 64-bit identifiers.

by tptacek
12/28/2025 at 10:46:19 PM
> I think you mean that 64 bits of hash output could be trivially collided using, say, Pollard's rho method. But it turns out that simple collisions are not an issue for such hashes used as identities.

No. I mean that 64 bits can probably be inexpensively attacked to produce first or second preimages.

It would be nice if a decentralized crypto system had memorable key identifiers and remained secure, but I think that is likely to be a pipe dream. So a tool like gpg shouldn’t even try. Use at least 128 bits and give three choices: identify keys by an actual secure hash, identify them by a name the user assigns, or pass them directly. Frankly I’m not sure why identifiers are even useful; see my original complaint about keyrings.
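A sketch of those three options side by side (Python; the names and the verify call are illustrative assumptions, not any real tool's interface):

    import hashlib

    pubkey = bytes.fromhex("aa" * 32)   # stand-in for a real 32-byte public key

    # Option 1: identify by a full-length secure hash, never truncated.
    fingerprint = hashlib.sha256(pubkey).hexdigest()

    # Option 2: identify by a name the user assigns locally.
    named_keys = {"build-server": pubkey}

    # Option 3: skip identifiers and pass the key itself directly, e.g.
    # verify(message, signature, public_key=pubkey)  # hypothetical call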
>> ...I should not even be able to do them separately even if I want to.
>Alas that is not possible. The problem is intrinsic to end to end encrypted messaging. Protocols like PGP combine them into a single key fingerprint so that the user does not have to deal with them separately.
Huh? It’s possible. It’s not even hard. It could work like this:
$ better_gpg decrypt_and_auth --sender_pubkey [KEY] --recipient_privkey [KEY]
Ciphertext input is supplied on stdin. Plaintext output appears on stdout but only if the message validates correctly.
by amluto
12/29/2025 at 11:58:23 AM
> I mean that 64 bits can probably be inexpensively attacked to produce first or second preimages.

Keep in mind that you would have to generate a valid keypair, or something that could be made into a valid keypair, for each iteration. That fact is why PGP got along with 32-bit key IDs for so long. PGP would still be using 32-bit key IDs if someone hadn't figured out how to mess with RSA exponents to greatly speed up the process. Ironically, the method with the slowest keypair generation became the limiting factor.
It isn't like this is a new problem. People have been designing and using key fingerprint schemes for over a quarter of a century now.
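A back-of-envelope illustration of why keypair generation is the limiter; the throughput figure below is an assumption for illustration, not a benchmark:

    # A second preimage on a 64-bit key ID costs ~2^64 candidate keypairs.
    tries = 2 ** 64
    keygens_per_second = 1e7            # assumed rate; varies wildly by algorithm
    years = tries / keygens_per_second / (3600 * 24 * 365)
    print(f"{years:,.0f} core-years")   # ~58,000 core-years at this assumed rate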
>$ better_gpg decrypt_and_auth --sender_pubkey [KEY] --recipient_privkey [KEY]
How do you know that the recipient key actually belongs to the recipient? How does the recipient know that the sender key actually belongs to you (so it will validate correctly)?
by upofadown
12/28/2025 at 8:32:46 PM
What you described is exactly why age is the better option.

GPG's keyring handling has also been a source of exploits. It's much safer to directly specify the recipient rather than rely on things like short key IDs, which can be brute-forced.

Automatic discovery simply isn't secure if you don't have an associated trust anchor. You need something similar to Keybase or another form of PKI to do that. GPG's keyservers are dangerous.

You technically can sign with age, but otherwise there's minisign and the SSH signing functionality.
by Natanael_L
12/30/2025 at 5:12:20 AM
And when do you need any of that stuff?

As a followup, is there anything in existence that supports "large-scale public key management (with unknown recipients)"? Or "automatic discovery, trust management"? Even X.509 PKI at its most delusional doesn't claim to be able to do that.
by pseudohadamard
12/27/2025 at 10:06:21 PM
sq (sequoia) should be able to sort that.

by baobun
12/27/2025 at 10:19:28 PM
I know, I have been using it recently.

by johnisgood
12/28/2025 at 4:12:29 PM
What is the alternative to PGP for the specific use case of secure email, one that doesn't mandate dealing with the X.509 certificate bureaucracy?

by benchloftbrunch
12/28/2025 at 4:30:07 PM
Don't encrypt email.

https://www.latacora.com/blog/2020/02/19/stop-using-encrypte...
by tptacek
12/28/2025 at 9:14:56 PM
The only alternative suggested by the linked article is giving up email completely in favor of centralized solutions like Signal. My short answer is “no”. My long answer is: <https://news.ycombinator.com/item?id=45390332>

by teddyh
12/28/2025 at 10:17:41 PM
I wrote the linked article. I don't care what secure messenger you use. But if you choose encrypted email over Signal because "centralization", you're LARPing. The first criterion for a secure messenger has to be that it is plausibly secure, and email isn't. You'd use encrypted email (for "decentralization") because you understand that the cost of losing the plaintext of your message is nil. If you tell strangers to do that, without certainty that their messages are also valueless, you're committing malpractice.

by tptacek
12/30/2025 at 5:21:31 AM
Something that doesn't require securing email. Both S/MIME and PGP were solutions for 1980s problems (TFA is slightly off about PGP's start date: the PGP design dates from 1987 and MS-DOS, not the 1990s, and S/MIME via PEM is from 1986). They're pretty much irrelevant today because almost all email is encrypted anyway via STARTTLS, and if you need full end-to-end encryption you use Signal or something similar.

by pseudohadamard
12/28/2025 at 9:15:14 PM
What's your use case here? Internal or external messaging?

by Natanael_L
12/30/2025 at 5:25:23 AM
Use case? We're crypto LARPing dammit, we don't need a use case!

by pseudohadamard
12/28/2025 at 8:58:40 AM
The thing I can't get past with PGP / GPG is that it tries to work around MITM attacks by encouraging users to place their social network on the public record (via public key attestation).

This is so insane to me. The whole point of using cryptography is to keep private information private. It's hard to think of ways PGP could fail more as a security / privacy tool.
by josephg
12/28/2025 at 12:12:34 PM
Do you mean keyservers? Keyservers have nothing to do with the identity verification required to prevent MITM attacks. There is only one method available for PGP: comparison of key fingerprints/IDs.

Keyservers are simply a convenient way to get a public key (identity). Most people don't have to use them.
by upofadown
12/27/2025 at 9:48:57 PM
Depending on what you are after, an alternative could be using SSH keys for signatures and age[1] for encryption targeting SSH keys.

by coppsilgold
12/27/2025 at 10:02:35 PM
sq (sequoia) is compatible and is available in your favorite distro. It's the recommended replacement.

by baobun
12/28/2025 at 7:08:52 AM
This is the right answer.

The problem mostly concerns the oldest parts of PGP (the protocol), which gpg (the implementation) doesn't want to, or cannot, get rid of.
by zimmerfrei
12/28/2025 at 5:01:18 PM
age: https://github.com/FiloSottile/age

by lagniappe
12/28/2025 at 1:59:24 PM
age

by vbezhenar
12/27/2025 at 8:12:03 PM
It's a fundamentally bad idea to have a single key that applications are supposed to look for in a particular place and then use to sign things. There is inherent complexity involved in making multi-context key use safe, and it's better to just avoid it architecturally.

Keys (even quantum-safe ones) are small enough that having one per application is not a problem at all. If an application needs multi-context use, it can handle that itself; if it does so badly, the damage is contained to that application. If someone really wants to make an application that just signs keys for other applications, to say "this is John Smith's key for git" and "this is John Smith's key for email", they could do that. Such an application would not need to concern itself with permissions for other applications calling into it. The user could just copy and paste public keys, or fingerprints, when they want to attest to their identity in a specific application.
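A minimal sketch of the one-key-per-application idea (Python with PyNaCl; the key directory and file layout are illustrative assumptions, not an existing tool):

    from pathlib import Path
    from nacl.signing import SigningKey

    def app_signing_key(app, keydir=Path.home() / ".app-keys"):
        # Each application gets its own signing key, so a compromise is
        # contained to that one application.
        keydir.mkdir(mode=0o700, exist_ok=True)
        path = keydir / (app + ".key")
        if path.exists():
            return SigningKey(path.read_bytes())
        key = SigningKey.generate()      # fresh Ed25519 key
        path.write_bytes(key.encode())   # 32-byte seed
        path.chmod(0o600)
        return key

    git_key = app_signing_key("git")
    # The public key the user copies wherever "John Smith's key for git" is needed:
    print(bytes(git_key.verify_key).hex())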
The keyring circus (which is how GPG most commonly intrudes into my life) is crazy too. All these applications insist on connecting to some kind of GPG keyring instead of just writing the secrets to the filesystem in their own local storage. The disk is fully encrypted, and applications should be isolated from one another. Nothing is really being accomplished by requiring the complexity of yet another program to "extra encrypt" things before writing them to disk.
I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
by alphazard
12/27/2025 at 8:56:37 PM
> The disk is fully encrypted, and applications should be isolated from one another.

For most apps on non-mobile devices, there isn't filesystem isolation between apps. Disk/device-level encryption solves for a totally different threat model; Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc.) because restricting key material access within the OS has merit.
> I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
Basically everything in PGP/GPG predates the existence of "corporate security circles".
by akerl_
12/28/2025 at 4:35:15 AM
> For most apps on non-mobile devices, there isn't filesystem isolation between apps.

If there isn't, there should be. At least my Flatpaks are isolated from each other.
> Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc), because restricting key material access within the OS has merit.
The Linux equivalents are suspicious and stuck in the past to say the least. Depending on them is extra tedious on top of the tediousness of any PGP keyrings, god forbid a combination of the two.
> Basically everything in PGP/GPG predates the existence of "corporate security circles".
Then we know where this stuff came from.
by Avamander
12/28/2025 at 5:18:35 AM
> Then we know where this stuff came from.

I can’t figure out what you mean by this.
by akerl_
12/28/2025 at 2:58:59 PM
Just a joke: if GPG indeed predates and was not inspired by corporate security theatre, then the opposite must be true, i.e. corporate security theatre was inspired by GPG/PGP.

by Avamander
12/28/2025 at 10:36:05 AM
And now certain people in corporate security only trust gpg, because they grew up with it :D

by deknos
12/27/2025 at 7:23:18 PM
These are not vulnerabilities in the "remote exploit" sense. They should be taken seriously, you should be careful not to run local software on untrusted data, and GPG should probably do more to protect users from shooting themselves in the foot, but the worst thing you could do is panic and throw out a process your partners and colleagues trust. There is nothing here that will disturb your workflow signing commits or apt-get install-ing from your distribution.

If you use cryptographic command line tools to verify data sent to you, be mindful of what you are doing and make sure you understand the attacks presented here. One of the slides is titled "should we even use command line tools", and yes, we should, because the alternative is worse, but we must be diligent in treating all untrusted data as adversarial.
by xorcist
12/27/2025 at 7:36:00 PM
A huge part of GPG’s purported use case is getting a signed/encrypted/both blob from somebody and using GPG to confirm it’s authentic. This is true for packages you download and for commits with signatures.

Handling untrusted input is core to that.
by akerl_
12/27/2025 at 7:42:10 PM
It is, and other software handling untrusted data should also treat it as adversarial. For example, your package tool should probably not output raw package metadata to the terminal.

by xorcist
12/27/2025 at 7:45:31 PM
I think you’re missing the forest for the trees.

by akerl_
12/27/2025 at 10:39:00 PM
It reads to me like attempting to verify a malicious ascii-armoured signature is potential RCE.

by tgsovlerkhgsel
12/27/2025 at 6:53:54 PM
I did the switch this year after getting yet another personal computer. I have 4 in total (work laptop, personal sofa laptop, Mac Mini, Linux tower). I used YubiKeys with gpg and resident ssh keys. All was fine except for the configuration needed to get it to work on all the machines. I also tend to forget the finer details and have to relearn the skills of fetching the public keys into the keychain etc. I got rid of all this by moving to the 1Password ssh agent and git ssh signing, which removes a lot of headaches from my ssh setup. I still have the YubiKey(s), though, as a 2nd factor for certain web services. And the gpg agent is still running, but only as a fallback. I will turn this off next year.

by larusso
12/28/2025 at 12:43:01 AM
I’ve ended up in the same place as you. I had previously set up my gpg key on a YubiKey and even used that gpg key to handle ssh authentication. Then at some point it just stopped working; maybe the hardware on my key broke. 2FA still works though.

In any case, I figured storing an SSH key in 1Password and using the integrated SSH socket server with my ssh client and git was pretty nice and secure enough. The fact that the private key never leaves the 1Password vault unencrypted and is synced between my devices is pretty neat. From a security standpoint it is indeed a step down from having my key on a physical key device, but the hassle of setting up a new YubiKey was not quite worth it.
I’m sure 1Password is not much better than having a passphrase-protected key on disk. But it’s a lot more convenient.
by snorremd
12/28/2025 at 6:19:17 AM
> I had previously set up my gpg key on a Yubikey and even used that gpg key to handle ssh authentication. Then at some point it just stopped working, maybe the hardware on my key broke

Did you try SSHing in verbose mode to ascertain any errors? Why did you assume the hardware "broke" without any objective indication of an actual failure condition?
> I figured storing an SSH key in 1Password and using the integrated SSH socket server with my ssh client and git was pretty nice and secure enough
How is trusting a closed-source, for-profit, subscription-based application with your SSH credential "secure enough"?
Choosing convenience over security is certainly not unreasonable, but claiming both are achieved without any compromise borders on ludicrous.
by DetectDefect
12/27/2025 at 8:52:38 PM
How is 1Password safer than the local keychain?

by hirako2000
12/27/2025 at 9:10:31 PM
The keys never leave the 1Password store, so you don’t have the keys on the local filesystem. That, plus the fact that the keys are synced over the cloud, was the selling point for me. Security-wise I guess it's a bit of a downgrade compared to resident keys, but the agent supports agent forwarding etc., which wasn't really working with YubiKey resident ssh keys. Also worth mentioning: I use 1Password, but Bitwarden has a similar feature as far as I know, and for those who want to self-host it might be the even better solution.

by larusso
12/27/2025 at 9:15:08 PM
> The keys never leave the 1Password store. So you don’t have the keys on the local file system.

Keychain and 1Password are doing variants of the same thing here: both store an encrypted vault and then give you credentials by decrypting the contents of that vault.
by akerl_
12/27/2025 at 7:18:14 PM
> 1Password ssh agent and git ssh signing

I’m still working through how to use this, but I have it basically set up and it’s great!
by hk1337
12/27/2025 at 8:51:41 PM
> I certainly want to get rid of gpg from my life if I can

I see this sentiment a lot, but you later hint at the problem. Any "replacement" needs to solve for secure key distribution. Signing isn't hard; you can use a lot of different things other than gpg to sign something with a key securely. If that part of gpg is broken, it's a bug, and it can and should be fixed.

The real challenge is distributing the key so someone else can verify the signature, and almost every way to do that is fundamentally flawed, introduces a risk of operational errors, or is annoying (web of trust, trust on first use, central authority, in-person, etc). I'm not convinced the right answer here is "invent a new one and the ecosystem around it".
by 65a
12/27/2025 at 9:18:16 PM
It's not like GPG solves for secure key distribution. GPG keyservers are a mess, and you can't trust their contents anyway unless you have an out-of-band way to validate the public key. Basically nobody is using the web of trust for this in the way that GPG envisioned.

This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website.
by akerl_
12/27/2025 at 10:34:06 PM
Yes, I'm not saying the web of trust ever worked. "Pre-established channels" are the other mechanisms I mentioned, like a central authority (https) or TOFU (just trust the first key you get). All of these have some issues that any alternative must also solve for.

by 65a
12/27/2025 at 10:51:53 PM
So if we need a pre-established channel anyways, why would people recommending a replacement for GPG workflows need to solve for secure key distribution?

This is a bit like looking at electric cars and saying ~"well you can't claim to be a viable replacement for gas cars until you can solve flight".
by akerl_
12/27/2025 at 8:56:13 PM
A lot of people are using PGP for things that don’t require any kind of key distribution. If you’re just using it to encrypt files (even between pointwise parties), you can probably just switch to age.

(We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.)
by woodruffw
12/27/2025 at 9:06:50 PM
This is not the first time I see "secure key distribution" mentioned in an HN thread about GPG alternatives, and I'm a bit puzzled.

What do you mean? Web of Trust? Keyservers? A combination of both? Under what use case?
by kaoD
12/27/2025 at 9:22:00 PM
I'm assuming they mean the old way of signing each other's keys.

As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people.
Or at least, more realistically, to a few nerds. I think I signed 3-4 people's keys.
The process had - as they say - a low WAF.
by kpil
12/27/2025 at 10:31:49 PM
> As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people.

GPG is terrible at that.

0. Alice's GPG trusts Alice's key tautologically.

1. Alice's GPG can trust Bob's key because it can see Alice's signature.

2. Alice's GPG can trust Carol's key because Alice has Bob's key, and Carol's key is signed by Bob.
After that, things break. GPG has no tools for finding longer paths like Alice -> Bob -> ??? -> signature on some .tar.gz.
I'm in the "strong set", I can find a path to damn near anything, but only with a lot of effort.
The good way used to be using the path finder, some random website maintained by some random guy, which disappeared years ago. The bad way is downloading a .tar.gz, checking the signature, fetching the key, then fetching every key that signed it, in the hope that somebody you know signed one of those, and so on.

And GPG is terrible at dealing with that; it hates having tens of thousands of keys in your keyring from such experiments.

GPG never grew into the modern era. It was made for people who mostly know each other directly. Addressing the problem of finding a way to verify the keys of random free software developers isn't something it ever did well.
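For what it's worth, the missing path search is conceptually simple; a toy sketch in Python over a made-up signature graph (nothing here is a real GPG interface):

    from collections import deque

    # Toy "who signed whom" data: key -> set of keys that signed it.
    signed_by = {
        "bob": {"alice"},
        "carol": {"bob"},
        "dave": {"carol"},
    }

    def trust_path(start, target):
        # Breadth-first search along "X signed Y" edges.
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == target:
                return path
            for key, signers in signed_by.items():
                if path[-1] in signers and key not in seen:
                    seen.add(key)
                    queue.append(path + [key])
        return None

    print(trust_path("alice", "dave"))   # ['alice', 'bob', 'carol', 'dave']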
by dale_glass
12/27/2025 at 10:42:46 PM
What's funny about this is that the whole idea of the "web of trust" was (and, as you demonstrate, is) literally PGP punting on this problem. That's how they talked about it at the time, in the 90s, when the concept was introduced! But now the precise mechanics of that punt have become a critically important PGP feature.

by tptacek
12/27/2025 at 11:02:01 PM
I don't think it punted as much as it never had that as an intended use case.

I vaguely recall the PGP manuals talking about scenarios like a woman secretly communicating with her lover, or Bob introducing Carol to Alice, and people reading fingerprints over the phone. I don't think long trust chains and the use case of finding a trust path to some random software maintainer on the other side of the planet were part of the intended design.
I think to the extent the Web of Trust was supposed to work, it was assumed you'd have some familiarity with everyone along the chain and work through it step by step. Alice would know Bob, who'd introduce his friend Carol, who'd introduce her friend Dave.
by dale_glass
12/27/2025 at 10:40:18 PM
In a signature context, you probably want someone else to know that "you" signed it (I can think of other cases, but that's the usual one). The way to do that requires them to know that the key which signed the data belongs to you. My only point is that this is actually the hard part, which any "replacement" crypto system needs to solve for, and that solving it is hard (none of the methods are particularly good).

by 65a
12/28/2025 at 4:42:09 AM
> The way to do that requires them to know that the key which signed the data belongs to you.

This is something S/MIME does, and I wouldn't say it does it badly. You can start from mailbox validation, and that already beats everything PGP has to offer in terms of ownership validation. If you do identity validation, or it's a national PKI issuing the certificate (as in some countries), it's a very strong guarantee of ownership. Coughing baby (PGP) vs. hydrogen bomb level of difference.

It sounds to me much more like an excuse to use PGP when it doesn't even remotely offer what you want from a replacement.
by Avamander
12/27/2025 at 11:56:16 PM
I think it should be mostly ad-hoc methods:

* If you have a website, put your keys on a dedicated page and direct people there (a sketch of this follows below).

* If you are in an org, there can be whatever kind of centralised repo.

* Add the hashes to your email signature and/or profile bios.

There might be a nice uniform solution using DNS and derived keys, like certificate chains? I am not sure, but I think it might not be necessary.
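A sketch of the website method with a pinned fingerprint (the URL and digest are placeholders; the point is that the hash published out of band is what you actually trust, not the TLS connection alone):

    import hashlib
    import urllib.request

    KEY_URL = "https://example.com/keys/signing.pub"   # placeholder
    EXPECTED_SHA256 = "replace-with-published-digest"  # placeholder

    with urllib.request.urlopen(KEY_URL) as resp:
        key_bytes = resp.read()

    if hashlib.sha256(key_bytes).hexdigest() != EXPECTED_SHA256:
        raise SystemExit("key at URL does not match the published fingerprint")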
by afiori
12/29/2025 at 1:06:59 PM
I haven't gone through the list in detail, but I don't see anything there that implies the ability to forge a valid signature without the private key, which is what matters most for git commits.

Most of the entries have to do with ways to compromise the unencrypted text presented to the user, so that the displayed message doesn't match the signed message. This allows for multiple different kinds of exploit.
But in the git commit case the main thing we care about, for commits authored by anyone whose signature we trust, is that the actual commit matches the signature, and git itself enforces that.
Of course, it's possible that a malicious user could construct a commit that expands to something misleading (with or without GPG). But that comes back to the point of signatures in the first place - if your repo allows random anonymous people to push signed commits, then you might have an issue.
by antonvs