1/31/2026 at 5:04:39 PM
WhatsApp's end-to-end encryption has been independently investigated: https://kclpure.kcl.ac.uk/ws/files/324396471/whatsapp.pdf
Full version here: https://eprint.iacr.org/2025/794.pdf
We didn't review the entire source code, only the cryptographic core. That said, the main issue we found was that the WhatsApp servers ultimately decide who is and isn't in a particular chat. Dan Goodin wrote about it here: https://arstechnica.com/security/2025/05/whatsapp-provides-n...
by martinralbrecht
1/31/2026 at 8:03:13 PM
> We didn't review the entire source code
And you don't see the issue with that? Facebook was bypassing security measures on mobile by sending data to itself on localhost using WebSockets and WebRTC: https://cybersecuritynews.com/track-android-users-covertly/
It's an audit of "they can't read it cryptographically", but the app can read it, and the app sends data in all directions. Push notifications can be used to read messages.
by vpShane
1/31/2026 at 8:16:49 PM
> Push notifications can be used to read messages.
Are you trying to imply that WhatsApp is bypassing e2e messaging through push notifications?
Unless something has changed, this table highlights that both Signal and WhatsApp are using a "Push-to-Sync" technique to notify about new messages.
https://crysp.petsymposium.org/popets/2024/popets-2024-0151....
by miduil
1/31/2026 at 8:38:29 PM
Push-to-Sync. We observed 8 apps employ a push-to-sync strategy to prevent privacy leakage to Google via FCM. In this mitigation strategy, apps send an empty (or almost empty) push notification to FCM. Some apps, such as Signal, send a push notification with no data (aside from the fields that Google sets; see Figure 4). Other apps may send an identifier (including, in some cases, a phone number). This push notification tells the app to query the app server for data, the data is retrieved securely by the app, and then a push notification is populated on the client side with the unencrypted data. In these cases, the only metadata that FCM receives is that the user received some message or messages, and when that push notification was issued. Achieving this requires sending an additional network request to the app server to fetch the data and keeping track of identifiers used to correlate the push notification received on the user device with the message on the app server.
by itsthecourier
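The push-to-sync flow quoted above can be sketched roughly as follows. This is a minimal illustration of the pattern, not any app's actual implementation; all class and method names here are hypothetical:

```python
# Minimal sketch of the push-to-sync pattern: FCM only ever relays an
# opaque identifier; the message body travels over the app's own
# (notionally secure) channel and the notification is built locally.

class AppServer:
    """Holds undelivered messages, keyed by an opaque id."""
    def __init__(self):
        self._pending = {}

    def deposit(self, msg_id, body):
        self._pending[msg_id] = body
        # What FCM would relay: no content, just "something arrived".
        return {"msg_id": msg_id}   # the (almost) empty push payload

    def fetch(self, msg_id):
        # The client pulls the real data over its own channel.
        return self._pending.pop(msg_id)


class Client:
    def __init__(self, server):
        self.server = server
        self.notifications = []

    def on_push(self, push_payload):
        # The push carries no message content, only a correlation id.
        body = self.server.fetch(push_payload["msg_id"])
        # The notification is populated locally, after the fetch.
        self.notifications.append(body)


server = AppServer()
client = Client(server)

push = server.deposit("m1", "hello, world")    # sender side
assert "hello" not in str(push)                # FCM never sees content
client.on_push(push)                           # recipient side
assert client.notifications == ["hello, world"]
```

The extra round trip to the app server is the cost the paper mentions; the benefit is that FCM's metadata shrinks to "a message arrived at this time".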
2/1/2026 at 12:43:32 AM
Is that not still incredibly vulnerable to timing attacks?
by chaps
2/1/2026 at 2:51:45 AM
Maybe I'm misinterpreting what you mean, but without a notification when a message is sent, what would you correlate a message-received notification with?
by xethos
2/3/2026 at 3:59:34 AM
No, I'm saying Meta can't be trusted.
by vpShane
2/1/2026 at 12:20:02 AM
Nothing changed, but many people struggle to understand their own degree of relative ignorance, and overvalue high-level details that are leaky abstractions, which make the consequentially dissimilar look superficially similar.
by fasbiner
1/31/2026 at 8:55:40 PM
Why did you not mention that the WhatsApp APK, even on devices without Google Play installed, loads Google Tag Manager's scripts? It is reproducibly loaded in each chat, and a MitM firewall can also confirm that. I don't know why the focus of audits like these is always on a specific part of the app, or only on the cryptography, and not on the overall behavior of what is leaked and transferred over the wire, nor on potential side-channel or bypass attacks.
Transport encryption is useless if the client afterwards copies the plaintext of the messages to another server, or, say, to an online translation service.
by cookiengineer
2/1/2026 at 12:01:30 PM
Things like this, combined with the countless ways to hide "feature flags" in a giant codebase, make me feel that anything less than "the entire app was verified + there is literally no way to dynamically load code from remote (so not even an in-app browser) + we checked 5 years of old versions and plan to do this for the next 5 years of updates" isn't particularly meaningful. Still very important, but my issue has never been with Zuck's inability to produce solid software, rather with his intentions, and so them being good engineers just makes them better at hiding bad stuff.
by afiori
2/1/2026 at 2:31:09 PM
Back in the day people called Skype [1] spyware because it had lots of backdoors in it and lots of undocumented APIs that shouldn't have been in there. The funny part was that Skype was probably the most obfuscated binary that was ever available as "legitimate" software, so there were regular reversing events to see how far you could get, from scratch to zeroday, within 48h hackathons. Those were fun times :D
[1] Skype, pre Microsoft rebrand of Lync as Skype
by cookiengineer
1/31/2026 at 8:59:29 PM
There's a whole section, early, in the analysis Albrecht posted that surfaces these concerns.
by tptacek
1/31/2026 at 10:17:56 PM
Where in the document is that? Can you provide a page number or section title?
by cookiengineer
2/1/2026 at 2:31:25 AM
^f
by jibe
2/1/2026 at 7:29:04 PM
Of particular note here is that while compromised WhatsApp servers could add arbitrary members to a group, each member's client would show the new member's presence and would not share prior messages, only future messages.
Now, of course, this assumes the client hasn't been simultaneously compromised to hide that. But it's defense in depth at the very least.
It is worth noting that this may be eroding as we speak: https://www.livemint.com/technology/tech-news/whatsapp-could... (Jan 24 2026) reports that Whatsapp is developing a way for one member to share historical messages en masse with a new group member. While this is manually triggered by the sender at the moment, it presents an enticing attack surface on technical, social-engineering, and political fronts to erode retroactive security much more rapidly going forward.
(And it goes without saying that if you think you're exempt from needing to worry about this because you're not involved in certain types of activity, the speed at which policies are evolving around the world, and the ability to rapidly process historical communications data at scale, should give you pause. "Ex post facto" is not a meaningful principle in the modern AI-enabled state.)
by btown
2/2/2026 at 3:28:40 PM
"People you send messages to have access to those messages. (And could therefore potentially share them with others.)" doesn't seem like a particularly scary security threat to me.
by Ajedi32
2/2/2026 at 5:39:28 PM
The threat here is that the ability of an attacker to add themselves to a thread, stacked with a new ability to either socially engineer or otherwise attack an existing member into clicking a single share-history button, could result in disclosure of history without explicit intent to share.
by btown
2/2/2026 at 3:52:18 AM
"We didn't review the entire source code, ..."
Why not?
"Our work is based primarily on the WhatsApp web client, archived on 3rd May 2023, and version 6 of the WhatsApp security whitepaper [46]."
They did not even look at the continuously changing mobile app; they only looked at part of the minified JavaScript in the web client.
Not sure what this accomplishes. Are the encryption protocols used sound? Is the implementation correct? Maybe, but the app is closed source and constantly changing.
But users who care want to know what connections the software makes, what is sent over those connections, to whom it is sent, and why. There is no implicit trust in Meta, only questions. The source code is hidden from public scrutiny.
For example, the app tries to connect to {c,e8,e10,g}.whatsapp.net over TCP on port 80
The app has also tried to connect over UDP using port 3478/STUN
These connections can be blocked and the user will still be able to send and receive texts and make and receive calls
Meta forces users to install a new mobile app, i.e., untrusted, unaudited code, multiple times per year. This install has grown in size by over 100%.
For example, there were at least four different apps (subsequent versions) forced on users in 2023, five in 2024 and four in 2025
In 2023 the first was 54.06MB. In 2026, it is now 126MB
by 1vuio0pswjnm7
1/31/2026 at 5:12:19 PM
Thank you for actually evaluating the technology as implemented instead of speculating wildly about what Facebook can do based on vibes.
by some_furry
1/31/2026 at 7:56:45 PM
Unfortunately a lot of investigations start out as speculation/vibes before they turn into an actual evaluation. And getting past speculation/vibes can take a lot of effort and political/social/professional capital before even starting.
by chaps
2/1/2026 at 4:45:10 AM
Well yeah. If they had solid evidence at the start, why would they need an investigation?
by lazide
2/1/2026 at 5:40:13 PM
It's not as obvious an answer as it initially sounds. I come at this from a stint in investigative journalism, where even beginning an investigation requires getting grants, and grants involve convincing other people that the money is going to good use. I've also been told by multiple editors that an investigation I ran was nothing, and it turned out to be something big... it really shifted how I perceive investigations, and what it means to stick your neck out when everyone's telling you that something isn't happening, even when it is.
by chaps
2/1/2026 at 4:05:15 PM
Vibes are a perfectly solid ground to refuse to engage with something.
by afiori
1/31/2026 at 11:33:45 PM
Hello Professor Albrecht,
thank you for your work.
I’ve been looking for this everywhere the past few days but I couldn’t find any official information relating the use of https://signal.org/docs/specifications/pqxdh/ in the signal protocol version that WhatsApp is currently using.
Do you have any information if the protocol version they currently use provides post-quantum forward secrecy and SPQR or are the current e2ee chats vulnerable to harvest now, decrypt later attacks?
Thanks for your time.
by Jamesbeam
1/31/2026 at 9:00:04 PM
They also decide what public key is associated with a phone number, right? Unless you verify in person.
by morshu9001
1/31/2026 at 11:23:49 PM
That's protected cryptographically with key transparency. Anyone can check what the current published keys for a user are, and be sure they get the same value as any other user. Specifically, your WA client checks that these keys are the right keys.
by NoahZuniga
2/1/2026 at 1:18:14 AM
Even if your client is asking other clients to verify, what if everyone has the same wrong key for a particular user WhatsApp has chosen to spoof?
by morshu9001
2/1/2026 at 12:02:53 PM
Well, surely your client knows what its own key is, and would notice that the listed key is wrong when it checks it.
by NoahZuniga
2/1/2026 at 5:16:16 PM
They can also tell your client it has the correct key. Yours and the other clients are all talking to their MITM in this scenario. There's fundamentally no way to solve this without users verifying keys out-of-band.
by morshu9001
2/2/2026 at 12:11:11 AM
> They can also tell your client it has the correct key.
No they can't. Key transparency cryptographically makes sure everyone gets the same result.
by NoahZuniga
2/2/2026 at 8:04:32 PM
Key transparency is a public list of keys, like what CAs do. That still trusts an authority. Of course, a third party could archive/republish the key list and you could trust them instead of WhatsApp, but that's what I call out-of-band key verification.
These are all good measures though. It's much harder for WhatsApp to mass-attack users this way.
by morshu9001
2/3/2026 at 11:36:01 AM
Well, more than just that. For the published key transparency information to be trusted, it has to be signed not just by WhatsApp but also by an independent witness, in this case Cloudflare. So for WA to do a man-in-the-middle attack they would also need to convince Cloudflare to sign two inconsistent tree heads.
by NoahZuniga
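The witness co-signing idea in this subthread can be illustrated with a toy model. This is a deliberate simplification under stated assumptions: real key-transparency deployments use Merkle trees, verifiable logs, and real digital signatures, and all names below are hypothetical:

```python
# Toy model of witness co-signing for key transparency: a client
# accepts a published tree head only if both the log operator and an
# independent witness have endorsed it, so serving a split view would
# require the witness to endorse two inconsistent heads.
import hashlib

def sign(signer_secret, tree_head):
    # Stand-in for a real digital signature (illustration only).
    return hashlib.sha256((signer_secret + tree_head).encode()).hexdigest()

def verify(signer_secret, tree_head, sig):
    return sign(signer_secret, tree_head) == sig

OPERATOR, WITNESS = "operator-secret", "witness-secret"

def client_accepts(tree_head, op_sig, wit_sig):
    # Both endorsements are required before the head is trusted.
    return (verify(OPERATOR, tree_head, op_sig)
            and verify(WITNESS, tree_head, wit_sig))

honest_head = "root=abc123,size=1000"
assert client_accepts(honest_head,
                      sign(OPERATOR, honest_head),
                      sign(WITNESS, honest_head))

# The operator alone cannot get a forged head accepted: it cannot
# produce a valid witness signature over the inconsistent head.
forged_head = "root=evil999,size=1000"
assert not client_accepts(forged_head,
                          sign(OPERATOR, forged_head),
                          sign(OPERATOR, forged_head))
```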
1/31/2026 at 9:52:31 PM
Can they control private keys and do replay attacks?
by uoaei
1/31/2026 at 10:13:23 PM
The Signal protocol prevents replay attacks, as every message is encrypted with a new key: either the next hash ratchet key, or the next future secret key with new entropy mixed in via the next DH shared key.
Private keys, probably not. WhatsApp is E2EE, meaning your device generates the private key with the OS's CSPRNG. (Like I also said above,) exfiltration of signing keys might allow a MITM, but that's still possible to detect, e.g. if you RE the client and spot the code that does it.
by maqp
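The "new key for every message" property can be sketched with a minimal hash ratchet. This is a simplification of the symmetric-key ratchet in the Double Ratchet, with a toy XOR cipher standing in for a real AEAD; nothing here is WhatsApp's or Signal's actual code:

```python
# Minimal symmetric hash-ratchet sketch: each message is encrypted
# under a fresh key derived by hashing the chain key forward, so a
# replayed ciphertext no longer decrypts under any later key.
import hashlib

def kdf(chain_key):
    # Derive (next_chain_key, message_key) from the current chain key.
    next_ck = hashlib.sha256(chain_key + b"\x01").digest()
    msg_key = hashlib.sha256(chain_key + b"\x02").digest()
    return next_ck, msg_key

def xor_crypt(key, data):
    # Toy cipher for illustration (NOT secure; real protocols use AEAD).
    return bytes(a ^ b for a, b in zip(data, key))

ck = b"shared-root-secret".ljust(32, b"\x00")

# The sender ratchets the chain forward for each message:
ck, k1 = kdf(ck)
ct1 = xor_crypt(k1, b"msg one")
ck, k2 = kdf(ck)
ct2 = xor_crypt(k2, b"msg two")

assert k1 != k2                           # every message gets a fresh key
assert xor_crypt(k1, ct1) == b"msg one"   # the matching key decrypts
assert xor_crypt(k2, ct2) == b"msg two"
```

Because the receiver advances its chain in lockstep, a replayed `ct1` arriving later would be tried under a newer key and come out as garbage rather than as the original plaintext.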
2/1/2026 at 1:06:29 AM
Wouldn't ratchet keys prevent MITM too? In other words, if a MITM has your keys and decrypts your message, then your keys are out of sync from now on. Or do I misunderstand that?
by TurdF3rguson
2/1/2026 at 10:12:56 AM
The ratchets would have different state, yes. The MITM would mix different entropy into the keys' states. It's only detectable if the MITM ever stops. But since the identity key exfiltration only needs to happen once per lifetime of the installation (longer if the key is backed up), the MITM could just continue forever, since it's just a few cycles to run the protocol on the server. You can then choose whether to read the messages or just ignore them.
One interesting way to detect this would be to observe the sender's outgoing and the recipient's incoming ciphertexts inside the client-to-server TLS, which can be MITM'd by the users themselves. Since the ratchet state differs, so do the keys, and thus, under the same plaintext, so do the ciphertexts. That would be a really easy way to detect a MITM.
by maqp
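That detection idea can be illustrated with the same kind of toy ratchet. The assumption here is that a MITM must terminate each victim's session separately, so the two legs have independent states; all names and the toy cipher are illustrative only:

```python
# If a MITM terminates each side's session separately, the sender's
# and recipient's ratchet states diverge, so the "same" message is
# encrypted under different keys and appears as different ciphertexts
# on the two TLS legs.
import hashlib

def next_key(state):
    # Ratchet the state forward and derive a message key from it.
    new_state = hashlib.sha256(state).digest()
    return new_state, hashlib.sha256(state + b"k").digest()

def encrypt(key, plaintext):
    # Toy cipher for illustration only.
    return bytes(a ^ b for a, b in zip(plaintext, key))

# Honest case: both ends share one ratchet state, ciphertexts match.
shared = b"honest-session".ljust(32, b"\x00")
_, k = next_key(shared)
assert encrypt(k, b"attack at dawn") == encrypt(k, b"attack at dawn")

# MITM case: the attacker runs two independent sessions, one per victim.
leg_a = b"mitm-leg-A".ljust(32, b"\x00")
leg_b = b"mitm-leg-B".ljust(32, b"\x00")
_, ka = next_key(leg_a)
_, kb = next_key(leg_b)

# An observer comparing ciphertexts of the same plaintext on both
# legs would see a mismatch: the proposed MITM tell.
assert encrypt(ka, b"attack at dawn") != encrypt(kb, b"attack at dawn")
```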
2/3/2026 at 3:55:01 PM
WhatsApp didn't implement Signal's protocol verbatim. They appropriated the core cryptographic security and then re-implemented the rest on their own servers. This removes all guarantees of secrecy as long as they can run arbitrary code on the servers they own.
by uoaei
1/31/2026 at 8:30:00 PM
> We didn't review the entire source code
Then it's not fully investigated. That should put any assessments to rest.
by digdigdag
1/31/2026 at 8:36:37 PM
By that standard, it can never be verified, because what is running and what is reviewed could be different. Reviewing the relevant elements is as meaningful as reviewing all the source code.
by 3rodents
2/1/2026 at 7:37:29 AM
Let's be real: the standard is "Do we trust Meta?"
I don't, and don't see how it could possibly be construed as logical to trust them.
I definitely trust a non-profit open source alternative a whole lot more. Perception can be different than reality but that’s what we’ve got to work with.
by dangus
1/31/2026 at 9:05:26 PM
Or they could even take out the backdoor code and then put it back in after review.
by giancarlostoro
2/1/2026 at 12:31:38 AM
This is why Signal supports reproducible builds.
by hedora
2/1/2026 at 1:39:18 AM
In this day and age, in a world with Docker and dev containers and such, it's kind of shocking that reproducible builds aren't table stakes.
by pdpi
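The reproducible-builds point boils down to a hash comparison: if a build is deterministic, anyone can rebuild from source and check that their artifact is byte-identical to the distributed one. A minimal sketch of the verification step (the byte strings are placeholders, not real binaries):

```python
# Reproducible-build verification in a nutshell: a deterministic build
# lets any third party rebuild from source and compare digests against
# the binary the store distributes.
import hashlib

def digest(artifact_bytes):
    return hashlib.sha256(artifact_bytes).hexdigest()

official_apk = b"\x00binary contents\x01"   # what the store distributes
my_rebuild   = b"\x00binary contents\x01"   # what I built from source

# Matching digests: the published binary corresponds to the audited source.
assert digest(official_apk) == digest(my_rebuild)

# Any divergence (an injected backdoor, a different toolchain output)
# shows up as a hash mismatch.
tampered = b"\x00binary contents\x02"
assert digest(official_apk) != digest(tampered)
```

This is why the "remove the backdoor before review, re-add it after" move upthread fails against reproducible builds: the shipped binary would no longer hash-match what reviewers rebuilt.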
2/1/2026 at 7:51:11 PM
Does it still require the gigantic binary blob?
by LtWorf
1/31/2026 at 9:13:21 PM
Ah yes, the Volkswagen solution.
by taneq
1/31/2026 at 9:23:47 PM
++1 "target market product alignment" :-D
by KellyCriterion
1/31/2026 at 9:16:10 PM
I have to assume you have never worked on security cataloging of third-party dependencies in a large codebase. Because if you had, you would realize how ridiculous it is to state that app security can't be assessed until you have read 100% of the code.
That's like saying "well, we don't know how many other houses in the city might be on fire, so we should let this one burn until we know for sure".
by ghurtado
2/1/2026 at 12:24:25 AM
What you are saying is empirically false. A change in a single line of executed code (sometimes even a single character!) can be the difference between a secure and a non-secure system.
This must mean that you have been paid not to understand these things. Or perhaps you would be punished at work if you internalized reality and spoke up. In either case, I don't think your personal emotional landscape should take precedence over things that have been proven and are trivial to demonstrate.
by fasbiner
by fasbiner
2/1/2026 at 8:32:30 AM
> Change in a single line of executed code (sometimes even a single character!) can be the difference between a secure and non-secure system.
This is kind of pointless; nobody is going to audit every single instruction in the Linux kernel or any complex software product.
by JasonADrury
1/31/2026 at 11:06:39 PM
It sounds like your salary has depended on believing things like a partial audit being worthwhile in the case that the client is the actual adversary.
by jokersarewild
1/31/2026 at 11:54:41 PM
Except Meta is not an adversary. They are aligned with people who want private messaging.
by charcircuit
2/1/2026 at 1:34:07 AM
Brutal sarcasm.
by mikkupikku
1/31/2026 at 8:41:01 PM
As long as the client-side encryption has been audited, which to my understanding is the case, it doesn't matter. That is literally the point of encryption: communication across adversarial channels. Unless you think Facebook has broken the laws of mathematics, it's impossible for them to decrypt the content of messages without the users' private keys.
by Barrin92
1/31/2026 at 8:50:11 PM
Well, the thing is, the key exfiltration code would probably reside outside the TCB. It's not particularly hard to have some function grab the signing keys and send them to the server. Then you can impersonate the user in a MITM. That exfiltration is one-time and it's quite hard to recover from.
I'd much rather not have blind faith in WhatsApp doing the right thing, and instead just use Signal, so I can verify myself that its key management is doing only what it should.
Speculating over the correctness of the E2EE implementation isn't productive; the metadata leak we know Meta takes full advantage of is reason enough to stick to proper platforms like Signal.
by maqp
1/31/2026 at 9:50:17 PM
> That exfiltration is one-time and it's quite hard to recover from.
Not quite true with Signal's double ratchet though, right? Because keys are routinely getting rolled, you have to continuously exfiltrate the new keys.
by jcgl
1/31/2026 at 10:01:25 PM
No, I said signing keys. If you're doing the MITM all the time, because there's no alternative path to route ciphertexts, you get to generate all those double-ratchet keys. And then you have a separate ratchet for the other peer in the opposite direction.
Last time I checked, WhatsApp features no fingerprint change warnings by default, so users will not even notice if you MITM them. The attack I described is for situations where the two users would enable non-blocking key change warnings and try to compare the fingerprints.
Not saying this attack happens by any means. Just that this is theoretically possible, and leaves the smallest trail. Which is why it helps that you can verify on Signal it's not exfiltrating your identity keys.
by maqp
2/1/2026 at 10:01:02 AM
Ah right, I didn't think about just outright MitMing from the get-go. If WhatsApp doesn't show the user anything about fingerprints, then yeah, that's a real hole.
by jcgl
1/31/2026 at 9:06:54 PM
Not that I trust Facebook or anything, but wouldn't a motivated investigator be able to find this key exfiltration "function" or code by now? Unless there is some remote code execution flow going on.
by subw00f
1/31/2026 at 9:49:09 PM
WhatsApp performs dynamic code loading from memory; GrapheneOS detects it when you open the app, and blocking this causes the app to crash during startup. So we know that static analysis of the APK is not giving us the whole picture of what actually executes.
This DCL could be fetching some forward_to_NSA() function from a server and registering it to be called on every outgoing message. It would be trivial to hide in tcpdumps. The best approach would be tracing with Frida and looking at syscalls to attempt to isolate what is actually being loaded, but it is also trivial for apps to detect they are being debugged and conditionally avoid loading the incriminating code in that instance. This code would only run in environments where the interested parties are sure there is no chance of detection, which is enough of the endpoints that even if you personally can set off the anti-tracing conditions without falling foul of whatever attestation Meta likely has going on, everyone you text will be participating unknowingly in the dragnet anyway.
by impure-aqua
1/31/2026 at 10:08:52 PM
"Many forms of dynamic code loading, especially those that use remote sources, violate Google Play policies and may lead to a suspension of your app from Google Play."
https://developer.android.com/privacy-and-security/risks/dyn...
I wonder if that would deter Meta.
by maqp
2/1/2026 at 12:34:41 AM
Some apps have always been more equal than others.
by monocasa
1/31/2026 at 10:20:04 PM
I don't know these OSes well enough. Can you MitM the dynamic code loads by adding a CA to the OS's trusted list? I've done this in Python apps because there are only 2 or 3 places it might check to verify a TLS cert.
by oofbey
1/31/2026 at 10:05:48 PM
> Not that I trust Facebook or anything but wouldn't a motivated investigator be able to find this key exfiltration "function" or code by now?
Yeah, I'd imagine it would have been found by now. Then again, who knows when they'd add it, and whether some future update removes it. Google isn't scanning every line of every version. I prefer to eliminate this kind of 5D guesswork categorically, and just use FOSS messaging apps.
by maqp
1/31/2026 at 9:06:03 PM
The issue is what the client app does with the information after it is decrypted. As Snowden remarked after he released his trove, encryption works, and it's not like the NSA or anyone else has some super secret decoder ring. The problem is that endpoint security is borderline atrocious and an obvious Achilles heel: the information has to be decoded in order to display it to the end user, so that's a much easier attack vector than trying to break the encryption itself.
So the point other commenters are making is that you can verify all you want that the encryption is robust and secure, but that doesn't mean the app can't just send a copy of the info to a server somewhere after it has been decoded.
by hn_throwaway_99
by hn_throwaway_99