3/4/2026 at 9:27:03 AM
I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app", as long as there are relatively good options of apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much idea of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice.
Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.
In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy' I do think it's on you to also then put additional resources into tackling the downsides of that.
by Traster
3/4/2026 at 9:53:55 AM
TikTok has private messaging, and it is used by hundreds of millions of people.
IMO no consumer service should have private 1:1 messaging without e2e. Either only do public messaging (i.e. like a forum), or implement e2e.
by londons_explore
3/4/2026 at 11:06:29 AM
TikTok has direct messages; they don't even call them private.
It's better that they're honest about this; nobody should believe for a second that WhatsApp or FB messages are truly E2EE.
DM on social media shouldn't be used for anything remotely private. It's a convenience feature, nothing more.
by RobotToaster
3/4/2026 at 7:08:12 PM
> nobody should believe for a second that WhatsApp or FB messages are truly E2EE.
Meta still tracks analytics, which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though.
by dvngnt_
3/4/2026 at 9:34:39 PM
> Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though
Correct. WhatsApp uses the Signal protocol, and there is zero evidence of them reading message contents except with the consent of one of the users involved (such as a user reporting a message for moderation purposes).
(And before anyone takes issue with that last qualifier, consent from at least one party is the bar for secure communications on any platform, Signal included. If you don't trust the person you are communicating with, no amount of encryption will protect you).
Discovering a backdoor in WhatsApp for Facebook/Meta to read messages would be a career-defining finding for a security researcher, so it's not like this is some topic nobody has ever thought to investigate.
by chimeracoder
3/4/2026 at 7:10:21 PM
> I'm not aware of any news of them
Yet. Until they say "We delete these messages after X time and they are gone, gone, and we're not reading them", assume they are reading them, or will read them and the information just hasn't got out yet.
I mean, we keep finding more and more cases where companies like FB and Google were reading messages years ago, and it wasn't until now that we found out.
by pixl97
3/4/2026 at 8:32:15 PM
> We delete these messages after X time
They never had the plaintext of the messages in the first place, so they don't need to delete them. That's what end-to-end encrypted means.
by nindalf
3/4/2026 at 10:00:30 PM
Whether Facebook/Meta can read the plain text of the messages or not depends on whether that encryption is "zero knowledge" or not, aka: does Facebook generate and retain the private encryption key, or does it stay on the users' devices only, never visible to Facebook or stored on Facebook servers?
In the former case, Facebook can decrypt the messages at will, and the e2ee only protects against hackers, not against Facebook itself, nor against law enforcement, since if Facebook has the decryption key they can be legally compelled to hand it over (and probably would voluntarily, going by their history).
by hogwasher
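The key-custody distinction described above can be sketched in a few lines. This is a toy illustration only: the XOR "cipher" is a stand-in for a real cipher such as AES, and all names and stored values are invented. The point is simply who can decrypt what:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real cipher (do NOT use for actual secrets):
    # XOR the data against a SHA-256-derived keystream.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# End-to-end model: the key is generated on the device and never uploaded.
device_key = secrets.token_bytes(32)
server_store = {"ciphertext": keystream_xor(device_key, b"meet at noon")}
# The server holds ciphertext only; without device_key it cannot decrypt.

# Provider-held-key model: the provider generates and retains the key.
provider_key = secrets.token_bytes(32)
server_store2 = {
    "ciphertext": keystream_xor(provider_key, b"meet at noon"),
    "key": provider_key,  # provider can decrypt at will, or be compelled to
}
```

In the first store there is nothing to hand over; in the second, a subpoena for the server contents yields both ciphertext and key.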
3/5/2026 at 3:30:20 PM
They use the Signal protocol. The keys are not generated by Facebook, and are never on their servers. They are generated on the devices themselves.
by Rubberducky1324
3/5/2026 at 12:23:20 AM
They don't need the plaintext if they have your key. Since they wrote the application, you have zero clue if they do or not.
by pixl97
3/4/2026 at 11:36:33 AM
> Tiktok has direct messages, they don't even call them private.
It may not be called that, but what are users expecting? Some folks may later be surprised when a warrant gets issued (e.g., from a divorce judge).
by throw0101c
3/4/2026 at 11:49:34 AM
If you are a grown adult and don't do research on "messaging apps" (which TikTok is not), then that's really on you.
by giancarlostoro
3/4/2026 at 4:02:10 PM
This viewpoint isn't a slippery slope, it's a runaway train.
"You moved into a neighborhood with lead pipes? That's on you, should have done more research."
"Your vitamins contained undisclosed allergens? You're an adult, and it didn't say it DIDN'T contain those."
"Passwords stolen because your provider stored them in plaintext? They never claimed to store them securely, so it's really on you."
by foobarchu
3/5/2026 at 1:26:50 PM
I once publicly stated it's understandable that someone would post an ad that says "No YouTubers" because people don't want to be content for others. The reply I got was "but you're being recorded all the time anyway", as if those are remotely related.
by technofiend
3/4/2026 at 9:14:53 PM
Legislating that everyone must always be safe regardless of what app they use is a one-way ticket to walled gardens for everything. This kind of safety is the rationale behind things like secure boot, Apple's App Store, and remote attestation.
Also consider what this means for open source. No hobbyist can ship an IM app if they don't go all the way and E2E encrypt (and security audit) the damn thing. The barriers to entry this creates are huge and very beneficial for the already powerful, since they can afford to deal with this stuff from day one.
by AlexandrB
3/4/2026 at 11:34:54 PM
Doesn't have to be a law. Can just be standard engineering practice.
WebSockets, for example, are in practice always encrypted in transit when served over wss:// (not e2e). That means anyone who implements a chess game over websockets gets transport encryption at no extra effort.
We just need e2e to be just as easy. For example, imagine a new type of unicode which is encrypted. Your application just deals with 'unicode' strings and the OS handles encryption and decryption for you, including if you send those strings over the network to others.
by londons_explore
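As a rough sketch of what such a transparent "sealed string" type might look like, here is a hypothetical Python value type. Everything here is invented for illustration, and the XOR stream is a toy stand-in for real cryptography; the idea is only that application code handles an opaque value while the key stays with the platform layer:

```python
import hashlib
import secrets
from dataclasses import dataclass

def _xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real cipher -- illustration only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

@dataclass(frozen=True)
class SealedText:
    """Hypothetical 'encrypted string' value: code can store and pass it
    around like any other value, but reading the plaintext needs the key."""
    ciphertext: bytes

    @classmethod
    def seal(cls, key: bytes, plaintext: str) -> "SealedText":
        return cls(_xor_stream(key, plaintext.encode("utf-8")))

    def open_with(self, key: bytes) -> str:
        return _xor_stream(key, self.ciphertext).decode("utf-8")

    def __str__(self) -> str:
        return "<sealed>"  # logs, UIs, and servers only ever see this

key = secrets.token_bytes(32)
msg = SealedText.seal(key, "meet at noon")
```

An app could pass `msg` through any transport or storage layer; only a holder of `key` (in the commenter's idea, the OS on the endpoints) ever sees "meet at noon".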
3/4/2026 at 6:17:30 PM
This isn't anything new, however. No messaging has been actually private since forever; that's why encryption was invented: to keep secrets, and to pass those secrets in a way that can be observed without revealing the secret.
Telephones can be tapped; people sold special boxes that would encrypt/decrypt that audio before passing it to the phone or to the ear. Mail can be opened, covertly or not. AIM was in the clear (I think at one point fully in the clear, later probably in the clear as far as the AOL servers were concerned)...
Unless the app/method is directly lying to users about being e2ee, it's not a slippery slope, it's the status quo. Now, there are some apps out there that I think I've seen that are lying. They are claiming they are 'encrypted' but fail to clarify that it's only private on the wire, like the AIM story: the message is encrypted while it flies to the 'switchboard', where it's plaintext, and then it's wrapped in encryption on the wire to send it to the recipient.
The claim here that actually makes me chuckle is somehow trying to paint e2ee as 'unsafe' for users.
by sleepybrett
3/4/2026 at 12:03:29 PM
If you are a grown adult and don't do research on "<insert any topic that could have a material negative impact on your life, but that is not currently on your radar as being a topic that could have a material negative impact on your life>", then that's really on you.
Unfortunately, this doesn't scale.
by oarsinsync
3/4/2026 at 10:09:54 PM
It definitely ignores that many people don't have time. If someone is working over 40 hours per week, plus maybe doing unpaid labor taking care of kids or elders, where are people supposed to find the time and energy to brush up on a million different topics they don't even know they might not know enough about? Especially if they might also have medical issues, or hobbies, or want to have any time at all to relax.
Obviously, one way to improve the situation would be to make sure people are paid fairly and not overworked and have access to good and affordable or free childcare and elder-care and medical care, but corporations don't want that either. If anything, they're incentivised to disempower workers and keep them uninformed, and to get as much time out of them as they can for as little money as possible.
by hogwasher
3/4/2026 at 12:15:10 PM
Well it does scale… just not in the way that is good for democracy.
by wizardforhire
3/4/2026 at 4:52:28 PM
80% of the population does not and will never do that level of deep dive on apps.
Same discussion for any form of technology, be it TVs or changing their car's oil.
The deliberate app-store-ification of all things computer is also designed to keep people from asking those questions -- just download it and install, pleb.
it's why the Zoomers can't email attachments or change file types: all of the computers they grew up with were designed so they never had to understand what happens under the hood.
by red-iron-pine
3/4/2026 at 7:13:59 PM
And I think because of all the handholding we are left worse off.
by johnisgood
3/4/2026 at 10:17:17 PM
Most people couldn't tell you how their car works, at least not enough to fix it. Is that handholding, too?
People can't be knowledgeable about everything. There's just too much information in the world, and too many different skills that could be learned, and not enough time.
A carpenter can rely on power tools without understanding fully how the tools work, and it's fine, as long as the tools are made to safe standards and the user understands basic safety instructions (e.g. wear protective eyewear).
To me, making sure that apps don't screw with people, even if they don't understand how the apps work, is roughly the equivalent of making sure power drills are made safely so they don't explode in peoples' hands.
by hogwasher
3/5/2026 at 4:30:41 PM
> As long as the user understands basic safety instructions
Yes, the internet has basic safety instructions, too (and probably just as many bother to read them); number one or two is “almost nothing online is ever really private”. I learned it by the mid 2000s; not knowing it in 2026 is not excusable with “people don’t need to know how everything works”.
by Leno1225
3/4/2026 at 11:46:36 PM
> Most people couldn't tell you how their car works […]
Most people couldn't tell you how their furnace or water heater works, or their flush toilet (siphonic effect).
by throw0101c
3/5/2026 at 4:17:54 PM
And I never said that people should be knowledgeable about everything...
... and this is not what I was referring to either.
Less handholding -> more learning... but even then, what I meant is that you do not have to be knowledgeable to know that your "private messages" are not really encrypted and can be read by the admin (in the case of forums, for one), and so forth.
by johnisgood
3/4/2026 at 9:07:31 PM
Way to dunk on OP, I guess, but nobody is playing semantics here; it's just whether people think this is a messaging channel with one intended recipient.
by s3p
3/4/2026 at 6:45:05 PM
Honestly, I'm tired of every app trying to become the everything app.
Now TikTok wants to be a messaging app. Snapchat has a short video feed just like TikTok. WhatsApp only has a text feed; how long until they also add a video feed?
by eloisant
3/4/2026 at 6:51:15 PM
Meta already has video feeds in Facebook and Instagram though; I imagine they wouldn't want to divert users away from those.
by vedaba
3/4/2026 at 11:37:35 AM
> nobody should believe for a second that WhatsApp or FB messages are truly E2EE
That's interesting. You think all the firms that audited WhatsApp and the Signal protocol used by WhatsApp, and all the programmers who worked there for decades and could see a lie and leak it if it were true, are all crooks? Valid opinion, I guess, but I won't call it "nobody should believe for a second".
(Curious you didn't mention Telegram; it is actually marketed as secure and e2e, and it has completely gimped "secret chats" that are off by default and used by almost nobody.)
by throwaway290
3/4/2026 at 11:51:27 AM
I forget if it's WhatsApp that technically lets you sync chats in unencrypted form to iCloud, which is the "loophole" around this. You can lock down your iCloud even tighter; I'm not sure Apple can do much if you fully lock down your iCloud, and not sure if this has been legally tested? It's not a very advertised feature, it's just a setting.
by giancarlostoro
3/4/2026 at 12:04:53 PM
WhatsApp on iPhone syncs to iCloud unencrypted by default[1].
iMessage also syncs to iCloud unencrypted by default[2].
[1] Depends on you paying for iCloud storage, so that you have space for a full phone backup to occur.
[2] Might be "free" with "iMessage in iCloud", an option to enable separately.
by oarsinsync
3/4/2026 at 12:32:50 PM
> WhatsApp iPhone syncs to iCloud unencrypted by default[1].
Not true. You must choose to enable it or not when you set up a new phone. On mine it does not back up.
by throwaway290
3/4/2026 at 1:55:43 PM
If you must "choose to enable" encryption, that implies it's off by default. If so, GP's statement is accurate.
by monooso
3/4/2026 at 2:09:06 PM
Choose to enable backups.
by simsla
3/4/2026 at 4:03:47 PM
No, I mean you must select yes or no. You can't use WhatsApp until you make a choice yourself.
by throwaway290
3/4/2026 at 12:00:30 PM
The Android version syncs all your chat logs to Google Drive without encryption by default. That's the backdoor.
by gzread
3/4/2026 at 6:45:28 PM
iCloud backups are encrypted, and can be end-to-end encrypted.
Also, backups have nothing to do with the messages being end-to-end encrypted. Like, if you don't use a passcode on the phone, the messages are still encrypted.
by ianburrell
3/4/2026 at 11:56:05 AM
Right now it has a switch to enable e2e for backups, but yeah, I think the default backup is probably a workaround...
by throwaway290
3/4/2026 at 5:53:22 PM
I'll believe it when it's FOSS.
by max-privatevoid
3/4/2026 at 7:36:19 PM
You mean you will read all the code, with dependencies, and compile it yourself to make sure? ;) Good for you, but good luck creating a popular e2e messenger then.
by throwaway290
3/4/2026 at 10:38:19 AM
In my experience most forums have private messaging.
Additionally, I think it is fine to say "we don't support e2ee". I prefer honesty to a bad (leaky) e2ee implementation; at least the user can make an informed choice.
by trashb
3/4/2026 at 10:48:53 AM
I agree. At least the take of "Yes, messages are stored on our servers" is honest. And whether they are accessed by anything other than a limited subpoena is a policy or legal issue.
by Ekaros
3/4/2026 at 12:38:41 PM
> In my experience most forums have private messaging.
Yeah, but it's kind of accepted that the forum owner could read it all if they so chose. Maybe this is a holdover from back in the old days when encryption was nowhere near default, during which forums arose.
by cucumber3732842
3/5/2026 at 12:38:40 PM
I don't disagree with providing people with more privacy, but what you present is a false dichotomy.
For a long time we lived with private messages over SMS that were easily readable by third parties.
by DeusExMachina
3/5/2026 at 6:00:34 AM
Lots of other consumer services, such as Strava, have direct messaging without e2e encryption. No privacy is guaranteed. This is fine; they're not deceiving anyone about how it works.
by nradov
3/4/2026 at 3:18:17 PM
Adding that private self-hosted forums can permit uploads of encrypted files, encrypted with a pre-shared secret or a secret shared over a private self-hosted Mumble voice chat server.
by Bender
3/4/2026 at 1:05:27 PM
And yet virtually all consumer services with 1:1 messaging lack e2e. This is a bit of a quixotic position to take.
by DoneWithAllThat
3/4/2026 at 11:22:49 AM
The email protocols would like to have a chat with you.
by tuwtuwtuwtuw
3/4/2026 at 11:27:37 AM
You can bring your own encryption to that, and bring your own client to automate it.
by kgwxd
3/4/2026 at 12:55:24 PM
You can encrypt the content but not the metadata, not even the subject, unless you use a customized client that encodes it (like deltachat, which doesn't use a subject at all), but then you still have your email address exposed.
For all intents and purposes, email is not e2ee.
by em-bee
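A small stdlib sketch of that point: even when the body is replaced by ciphertext (PGP/S-MIME in a real setup; placeholder bytes here, with invented addresses), the envelope stays readable to every relay in between:

```python
import base64
from email.message import EmailMessage

# Pretend-encrypted body; a real setup would use OpenPGP or S/MIME.
encrypted_body = base64.b64encode(b"...opaque ciphertext bytes...").decode()

msg = EmailMessage()
msg["From"] = "alice@example.org"  # metadata: visible to every relay
msg["To"] = "bob@example.net"      # metadata: visible
msg["Subject"] = "(encrypted)"     # plaintext unless the client hides it
msg.set_content(encrypted_body)

wire = msg.as_string()
# Relays and providers see who is talking to whom, and when,
# even though the body itself is opaque to them.
```

So the content is protected, but the social graph is not.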
3/4/2026 at 3:10:17 PM
Email encryption for most people is sufficient even if the metadata is exposed. One can simply state in their encrypted email "Bing Bing Bong" or "Why did you not put the trash out?", which might mean to the recipient: "check the second SFTP server" or "let the cat outside" or "jump on my private Mumble chat server" or "get on my private self-hosted IRC server". The email message need not be encrypted, for that matter.
The intended payload can be in a header-less encrypted file on a throw-away SFTP server in a tmpfs ram disk.
by Bender
3/4/2026 at 8:07:31 PM
So it's end-to-end encrypted except that third parties can see who you communicated with and when? Sure.
by tuwtuwtuwtuw
3/4/2026 at 8:27:38 PM
I have never considered metadata a part of the term E2EE. It has always been about the message contents.
I understand that metadata is valuable information for spies/governments, and that encrypting or hiding it is valuable for privacy. But if you use that definition, there are almost no E2EE protocols on the planet in use.
First and foremost, any protocol that uses Apple or Google push notifications is giving metadata to those organizations. WhatsApp, iMessage, Signal, Telegram private messages: all of that leaks metadata, but the contents of messages are hidden from the provider.
by unethical_ban
3/4/2026 at 8:48:01 PM
Exactly.
by Bender
3/4/2026 at 8:08:56 PM
Yeah bro, genius, that sounds like a totally actionable thing people will do all the time with email. Be sure to drink your Ovaltine.
by beeflet
3/4/2026 at 8:47:50 PM
> yeah bro genius
I know, right? I admit that is mostly for people on Linux desktops. People on smartphones are 100% monitored regardless of encryption or fake E2EE that platforms pinky promise is really E2EE, like Signal. Shame on Moxie, he knows better.
Ovaltine has a crapload of sugar. Don't drink that horse piss.
by Bender
3/4/2026 at 8:06:16 PM
I can bring my own encryption to TikTok as well. It has roughly the same usability and usage.
by tuwtuwtuwtuw
3/4/2026 at 6:20:55 PM
You can bring your own encryption to ANY messaging platform; that doesn't mean it will be easy to use. E2ee just makes it handy so that users don't need to pre-share any keys.
by sleepybrett
3/4/2026 at 5:42:55 PM
> as long as there are relatively good options of apps that do have privacy (and I think there are)
Once you have enormous network effect like TikTok has, you don't really have any free selection of alternative apps. You are free to use one, but you will be the only sad user over there.
Regulations are needed that would force large platforms like TikTok and Instagram to enable federation, opening them up to actual competition. This way platforms would be able to compete on monetisation and usability, instead of competing on locking in their precious users more strictly.
by smugglerFlynn
3/4/2026 at 7:56:46 PM
“Will we ever end the MySpace monopoly?”
> MySpace is well on the way to becoming what economists call a "natural monopoly". Users have invested so much social capital in putting up data about themselves it is not worth their changing sites, especially since every new user that MySpace attracts adds to its value as a network of interacting people.
> "In social networking, there is a huge advantage to have scale. You can find almost anyone on MySpace and the more time that has been invested in the site, the more locked in people are".
https://www.theguardian.com/technology/2007/feb/08/business....
by acheron
3/4/2026 at 10:27:11 PM
Sure, but then everyone moved to Facebook. The monopolist changed, but not the monopolistic market and the lack of consumer choice.
And nobody gained privacy in the process (I rather think everyone lost even more of it).
The situation currently permits only a tiny number of winning companies at a time, and the userbase is locked in even as the site becomes wildly unpopular, until some threshold of discontent is reached, and then everyone moves, and then that new site also enshittifies and the cycle repeats.
Federation is a mechanism whereby people would be able to actually choose providers as individuals and at any time, instead of having to wait years for a critical mass of upset people to build up and leave [current most popular social media site], and instead of being forced to go to [new most popular social media site].
by hogwasher
3/4/2026 at 7:08:40 PM
> Regulations are needed
Lolololol. No, not regulations. Regulators. With the people we currently have voted into office in the US, the only regulations we are going to get are ones saying Sam and Peter must look at everything you do all the time.
Until we stop voting for more authoritarianism, expect ever increasing amounts of authoritarianism.
by pixl97
3/5/2026 at 12:06:07 AM
I would argue the only things that stop the current situation from snowballing into something much worse are pre-existing institutions and regulations.
That's also why dismantling and challenging these is often the very first priority for authoritarian actors.
by smugglerFlynn
3/4/2026 at 10:28:20 PM
I think it was clear what they meant.
by hogwasher
3/4/2026 at 8:03:27 PM
Federation would never work. How would it work here? Either you are forcing TikTok to give pageviews to federations of spam, or you are letting TikTok decide which federations to work with, which essentially results in no federation.
3/4/2026 at 11:43:27 PM
Nobody stops spammers from creating websites, but we still have search engines and the web. Nobody stops spammers from sending emails, but we still use SMTP.
It is just a matter of the tools we build to rank and filter content. With open protocols, platforms can actually compete on antispam tools, among other features.
by smugglerFlynn
3/4/2026 at 12:53:44 PM
It might be fine if they presented an honest choice.
They are lying straight off, though... police and safety teams don't read messages only "if they needed to" to keep people safe. They do so for a large variety of other reasons, such as suppressing political dissent and asserting domination and control.
I don't think we can expect most people to understand TikTok's BS here either. I notice even a skeptic like you is uncritically echoing the dubious conflation of privacy and CSAM.
by jmull
3/4/2026 at 1:47:13 PM
Anyone who doubts the requirement for e2e messaging should not be considered a skeptic; they are fully buying into whatever narrative LEO would like you to believe.
by hobs
3/5/2026 at 4:11:55 AM
It's fine, except for their argument that it makes people less safe. If they want to disallow encryption, they don't need to lie to people while they're at it.
by LZ_Khan
3/4/2026 at 10:18:37 AM
No, saying that e2e encryption makes users _less_ safe is completely dishonest; nothing is fine about this.
The logic of "anything is better than before" is also fallacious.
by khalic
3/4/2026 at 10:21:46 AM
Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation).
If it's E2EE, no one except the sender and receiver knows about this conversation. You want an MITM in this case to detect/block such things, or at least keep a record of what's going on for a subpoena.
I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.
by roncesvalles
3/4/2026 at 10:34:03 AM
The receiver has a proven and signed bundle that they can upload to the abuse report. So the evidence has even stronger weight. They can already decrypt the message; they can still report it.
by shakna
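For the curious, this kind of receiver-side reporting resembles what the literature calls "message franking" (used by Facebook Messenger's secret conversations). A stripped-down sketch, with a bare HMAC standing in for the full construction and all names invented:

```python
import hashlib
import hmac
import secrets

# Sender side: commit to the plaintext with a one-time franking key.
plaintext = b"abusive message"
franking_key = secrets.token_bytes(32)
commitment = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
# Only the commitment (not the plaintext or key) is visible to the
# server, which records it at delivery time alongside the ciphertext.
server_log = {"commitment": commitment}

# Receiver side, on reporting abuse: reveal plaintext plus franking key.
report = {"plaintext": plaintext, "franking_key": franking_key}

# Moderator: verify the report matches what the server saw delivered,
# so the reported text cannot be forged after the fact.
recomputed = hmac.new(report["franking_key"], report["plaintext"],
                      hashlib.sha256).digest()
verified = hmac.compare_digest(recomputed, server_log["commitment"])
```

This lets a platform act on reports with cryptographic evidence while never being able to read unreported messages.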
3/4/2026 at 10:41:47 AM
Yes, but this leaves reporting by the minor as the only way to identify this behavior. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted.
I'm not saying no E2E messaging apps should exist, but maybe they don't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent, so that there is the ability for someone to monitor messages.
by michaelmior
3/4/2026 at 11:03:28 AM
> I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted
Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are paranoid about this sort of thing not because they think law enforcement is more effective when it is constrained; how easily crimes can be prosecuted is only one dimension of safety.
> However, an alternative could be allowing the sharing of the encryption key with a parent
Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
by danlitt
3/5/2026 at 12:54:32 AM
> Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?
This is a false equivalency. I don't have to use TikTok DMs if I want E2EE. I don't have a choice about laws that allow the police to violate my rights. I'm not claiming that all E2EE apps should be banned.
> Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
Exactly why I suggested that as a possible alternative.
by michaelmior
3/5/2026 at 9:50:31 AM
> This is a false equivalency.
I'm not making an equivalency. I'm just trying to get you to think about how something that is at surface level true is not necessarily a "fair argument".
> I don't have to use TikTok DMs if I want E2EE.
I don't know why you think this is a convincing argument. It is currently illegal to tap people's phone lines, but when phones were invented it obviously was not illegal. It became illegal in part because people had a reasonable expectation of privacy when using the phone. They also have a reasonable expectation of privacy when using TikTok DMs - that's why people call them "private messages" so often!
> Exactly why I suggested that as a possible alternative.
My point is that you are offering these as alternatives when they are profoundly different proposals. It is like me saying I am pro forced sterilization and then offering as an alternative "we could just only allow it when people ask for it". That's a completely different thing! Having autonomy over your online life as a family rather than necessarily as an individual is totally ok. Surrendering that autonomy is not.
by danlitt
3/4/2026 at 11:30:48 AM
> Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?
Police can access your home with a warrant.
Police cannot access your E2EE DMs with a warrant.
by InsomniacL
3/4/2026 at 4:57:06 PM
Not answering my question!
> Police cannot access your E2EE DMs with a warrant.
They can and do, regularly. What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught. But refusing to give up encryption keys, and suspiciously empty chat histories, with a valid warrant is very good evidence of a crime in itself.
They also can't prevent you from flushing drugs down the toilet, but somehow people are still convicted for drug-related crimes all the time. So - yes, obviously, the police could prosecute more crimes if we gave up this protection. That's how limitations on police power work.
by danlitt
3/4/2026 at 11:35:58 PM
> What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught
If you are pretty confident you're under investigation, then this might be Obstruction of Justice, and that's pretty illegal.
by NoahZuniga
by NoahZuniga
3/4/2026 at 9:07:05 PM
> But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself.
Uh, it absolutely isn't? WTF dystopian idea is this?
by Tadpole9181
3/5/2026 at 9:43:21 AM
It certainly can be; destruction of evidence is a crime. If they can prove you destroyed evidence, even if they can't prove that the destroyed evidence incriminates you, that's criminal behaviour. For instance, if it's known by some other means that you have a conversation history with person X, but not whether that conversation history is incriminating, and then when your phone is searched the conversation history is completely missing, that is strong evidence of a crime.
by danlitt
3/4/2026 at 12:34:50 PM
And they shouldn't be able to. Police accessing DMs is more like "listening to every conversation you ever had in your house (and outside)" than "entering your house".by allreduce
3/4/2026 at 12:35:46 PM
> Police cannot access your E2EE DMs with a warrant.
Well, they kind of can if they nab your cell phone or another device that has a valid access token.
I think it's kind of analogous to the police getting at one's safe. You might have removed the contents before they got there but that's your prerogative.
I think this results in acceptable tradeoffs.
by cucumber3732842
3/4/2026 at 12:02:09 PM
Yes, that is a fair argument, and most countries allow the use of surveillance cameras in public for this reason.
by gzread
3/5/2026 at 9:54:40 AM
"In public" is the operative phrase (and surveillance cameras in public are extremely recent and very controversial, so not as strong an argument as you might be thinking).
by danlitt
3/5/2026 at 3:04:39 AM
> I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.
The problem with that idea is that you are implying E2E should require age verification. Everyone should have access to secure end-to-end encryption.
by EmbarrassedHelp
3/4/2026 at 10:42:03 PM
Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case they depict a nude minor?
Because no matter how you do that, it would result in false positives, and either unfair auto-bans and erroneous reports to law enforcement (so no human views the images), or human employees viewing other adults' consensual nudes that were meant to be private. Or it would result in adult employees viewing nudes sent from one minor to another minor, which would also be a major breach of those minors' privacy.
There is a program whereby police can generate hashes based on CSAM images, and then those hashes can be automatically compared against the hashes of uploaded photos on websites, so as to identify known CSAM images without any investigator having to actually view the CSAM and further infringe on the victim's privacy. But that only works against already-known images, and can be done automatically whenever an image is uploaded, prior to encryption. The encryption doesn't prevent it.
Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice.
I'm sure some offenders could be caught this way, but it would also cause so many problems itself.
by hogwasher
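The known-hash comparison described in the comment above can be sketched as follows. This is a toy illustration, not any platform's actual pipeline: the placeholder hash value is invented, and a plain SHA-256 only matches byte-identical files, whereas real systems such as PhotoDNA use perceptual hashes that tolerate re-encoding and resizing.

```python
import hashlib

# Placeholder entries standing in for a law-enforcement-provided hash list.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the upload's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The check runs on the plaintext at upload time, which is why it remains compatible with encrypting the content afterwards, as the comment notes.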
3/5/2026 at 12:51:14 AM
> Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor?
No, I was not suggesting that.
by michaelmior
3/4/2026 at 12:01:22 PM
SimpleX handles this by sending the decryption keys when the receiver reports the message.
by gzread
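The report-with-key idea mentioned above can be sketched as a toy scheme (this is an illustration of the concept, not SimpleX's actual protocol): each message is encrypted under its own fresh key, and reporting a message discloses only that one key, so moderators can read the reported message and nothing else.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR, for illustration only; a real client
    # would use an authenticated cipher.
    return bytes(b ^ k for b, k in zip(data, key))

def encrypt_message(plaintext: bytes) -> tuple[bytes, bytes]:
    key = secrets.token_bytes(len(plaintext))  # fresh key per message
    return xor_bytes(plaintext, key), key

def report_message(ciphertext: bytes, key: bytes) -> bytes:
    # The receiver attaches only this message's key to the report,
    # letting a moderator decrypt exactly one message.
    return xor_bytes(ciphertext, key)
```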
3/4/2026 at 10:39:07 AM
Keeping children safe and prosecuting are two different concepts, only vaguely related. So no, being able to track pdfs doesn't make children safer. What keeps them safe is teaching them safe communication habits and keeping them away from things like Tiktok.
We shouldn't make the world a worse place for everyone because some parents can't take care of their children.
by khalic
3/4/2026 at 12:37:13 PM
> Keeping children safe and prosecuting are two different concepts, only vaguely related.
See also: That time the FBI took over a CSAM site and kept it running so they could nab a bunch of users.
by cucumber3732842
3/4/2026 at 2:15:54 PM
Not necessarily saying what they did was right, but I think there's a strong utilitarian argument to be made that what they did in that case was, in fact, the best way to keep children safe.
What's more dangerous? CSAM on the internet? Or actual child predators running loose?
by Ajedi32
3/4/2026 at 2:49:41 PM
That stuff spreads and re-spreads just like anything else people download off the internet. There's a pretty strong argument for shutting it down right away. IIRC most users were outside jurisdiction.
by cucumber3732842
3/4/2026 at 3:42:44 PM
Even if one more person was prosecuted it was worth it. If you shut down an illegal website a new one will show up a month later, with the same people involved, and you achieved nothing.
by integralid
3/4/2026 at 5:32:18 PM
What was the rate of child exploitation in the GDR?
by roughly
3/4/2026 at 11:33:18 AM
Ugh. The kids aren't even safe from the people making, and enforcing, laws. This argument should be long over for anyone with eyes or ears.
by kgwxd
3/4/2026 at 10:29:21 AM
Imagine Hamas is your government and wants to figure out who's gay. You don't want a MITM capability that lets them do this.
Pick your definition of safe.
by philipallstar
3/4/2026 at 10:46:41 AM
In that case don't use Tiktok DMs to discuss your sexuality. I think it is strange that people feel like they have to be able to talk about sensitive topics over every interface they can get their hands on.
Similarly, in "traditional" media you may not want to discuss such private matters on a radio broadcast. Perhaps you would rather discuss them on the phone or over snail mail, as there is more of an expectation of privacy in those mediums.
by trashb
3/4/2026 at 5:34:11 PM
Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.
by roughly
3/4/2026 at 12:56:12 PM
I'm commenting in the context of the conversation, not in a vacuum. You could just as (in fact, much more) easily say that children shouldn't be on apps with private messaging enabled. That would help a lot more, and then we could keep e2ee.
by philipallstar
3/4/2026 at 10:56:18 AM
> there is more of an expectation of privacy on those medium
What does the "p" in "pm" stand for?
by danlitt
3/4/2026 at 11:39:54 AM
Excuse me, I confused "private messages" (PM) with "direct messages" (DM).
I will update above.
by trashb
3/4/2026 at 4:52:01 PM
I don't think you confused anything, except for the terminology the platform uses. There is an obvious expectation of privacy when sending direct messages!
by danlitt
3/4/2026 at 6:24:03 PM
Hasn't been true ANYTIME IN HISTORY. Hell, it was well understood even by children that no conversation you had on the telephone was truly private. That's why ciphers were invented.
by sleepybrett
3/5/2026 at 9:53:06 AM
What are you talking about? It is illegal to tap people's phone lines or to interfere with mail. Are you saying people don't have a reasonable expectation of privacy even when it's illegal to be spied on?
by danlitt
3/5/2026 at 4:55:55 PM
'Illegal' doesn't really mean anything in this, or any other, day and age when you are talking about the very rich, the very powerful, or the state.
The good thing about e2ee is that it probably makes the list of those with the ability to decrypt things encrypted e2e somewhat smaller. Fact is, hacking can get to those keys. (i.e. if a state actor zero-click exploits your phone, they are going to be able to get your private key and the messages in memory)
by sleepybrett
3/5/2026 at 5:41:19 PM
> 'Illegal' doesn't really mean anything in thisThis is a thread arguing about what the law should be.
> Fact is hacking can get to those keys.
Everything made by humans is fallible.
by danlitt
3/4/2026 at 12:02:44 PM
It stands for "not a public timeline post".
by gzread
3/4/2026 at 4:52:43 PM
It should be obvious from how contrived your wording is that nobody thinks of them this way.
by danlitt
3/4/2026 at 10:56:43 AM
This is fine if you have TLS encryption and the platform is not local.
Sure, they can fabricate some evidence and get access to your messages, in which case, valid point.
by miki123211
3/5/2026 at 3:21:27 AM
It's a kind of Trojan horse propaganda, in my opinion. Users get used to the argument with TikTok and then apply it to other platforms.
Put it this way: why wouldn't those same arguments apply to any platform (if you believed them)?
Put it this way: why wouldn't those same arguments apply to any platform (if you believed them)?
by derbOac
3/4/2026 at 11:24:23 AM
Well, having no E2E encryption is safer than having a half-baked E2E encryption that has a backdoor and can be decrypted by the provider.
And as for TikTok's stance, I think they just don't want to get involved with the Chinese government over encryption (and give a false sense of privacy to users).
by fendy3002
3/4/2026 at 10:55:24 AM
It makes certain users less safe in certain situations.
E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not technical, decision, but if we claim that there are any absolutes here, we just make sure that we'll never be taken seriously by anybody who matters.
by miki123211
3/4/2026 at 11:50:59 AM
Claiming e2e makes children less safe is flat-out dishonest. And the irony of you criticising “absolutes” after trying to pass one is just delicious.
by khalic
3/4/2026 at 12:03:25 PM
What are children at risk of, when E2EE is used?
What are children at risk of, when E2EE is not used?
by gzread
3/4/2026 at 5:34:59 PM
> What are children at risk of, when E2EE is used?
Potential exposure to abusive adults.
> What are children at risk of, when E2EE is not used?
State-sanctioned violence.
by roughly
3/4/2026 at 12:57:55 PM
This is the argument they can’t have…
by reactordev
3/4/2026 at 9:48:25 PM
I am fine with TikTok remaining one of those 'we watch what you are doing' platforms. Those who do not care can have that if they wish; I do not mind.
But bullshitting that it makes users more safe, that is ... bullshit! Worse than that, it distorts public opinion, intentionally fooling the gullible.
by mihaaly
3/4/2026 at 7:18:05 PM
Fine with me too. I think many other apps (WhatsApp, FB, etc.) are using E2EE for PR purposes and are not actually good implementations of E2EE.
Good implementations of E2EE:
1. Generate the key pairs on device, and the private key is never seen by the server nor accessible via any server push triggered code.
2. If an encrypted form of the private key is sent to the server for convenience, it needs to be encrypted with a password with enough bits of entropy to prevent people who have access to the server from being able to brute force decode it.
3. Have an open-source implementation of the client app facilitating verifiability of (1) and (2)
4. Permit the users to self-compile and use the open-source implementation
If a company isn't willing to do this, I'd rather they not call it E2EE and dupe the public into thinking they're safe from bad actors.
by dheera
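Points (1) and (2) of the checklist above can be sketched in code. This is a hedged illustration under stated assumptions: random bytes stand in for an on-device key pair (a real app would generate, say, an X25519 key), and the "wrap" here is a simple XOR under a scrypt-derived key purely to show the derivation step; a real client would use an AEAD cipher for the wrapping.

```python
import os
import hashlib

def derive_wrapping_key(password: str, salt: bytes) -> bytes:
    # Memory-hard KDF, so a server holding the wrapped key cannot
    # cheaply brute-force low-entropy passwords.
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

def wrap_private_key(private_key: bytes, password: str) -> tuple[bytes, bytes]:
    # Key generated and wrapped on device; only (salt, wrapped) would
    # ever be sent to the server.
    salt = os.urandom(16)
    wk = derive_wrapping_key(password, salt)
    wrapped = bytes(a ^ b for a, b in zip(private_key, wk))
    return salt, wrapped

def unwrap_private_key(salt: bytes, wrapped: bytes, password: str) -> bytes:
    wk = derive_wrapping_key(password, salt)
    return bytes(a ^ b for a, b in zip(wrapped, wk))
```

The point the sketch makes concrete: the server only ever sees the salt and the wrapped blob, so recovering the private key requires guessing the password through the memory-hard KDF, which is exactly why point (2) insists on enough password entropy.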
3/4/2026 at 1:55:42 PM
> I think it's fine to say "You don't really have privacy on this app"
Disagree. To analogize why: privacy isn't heated seats, *it's seat belts*. Comfort features and preferences are fine to tailor to your customers and your business model. Jaguar targets a different market than Ford, and that's just fine.
Safety features should be non-negotiable for all. Both Jaguar and Ford drivers merit the utmost protection against injury in crashes. Likewise, all applications that offer user messaging functionality should offer non-defective, non-harmful versions of it. To do that, e2e privacy is absolutely necessary.
>I just don't see the point in expecting some sort of principled stance out of them.
This is the defeatism that adds momentum to a downhill trajectory. Exactly the opposite approach arrests the slide - users expecting their applications and providers to behave in principled ways, and punishing those who do not, are what keeps principles alive. Failing to expect lawful and upright behavior out of those you depend on, be they political leaders or software solutions providers, guarantees that tomorrow's behavior will be less lawful and upright than yesterday's. Stop writing these people a pass for this horrible behavior, and start holding them unreasonably accountable for it, then we'll see behavior start to change in the direction that we mostly all agree that it needs to.
The most effective protests against internet censorship came from massive grass roots movements, with users drawing a line in the sand that they will not tolerate further impositions on their freedom.
>In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform.
The irony is manifest: billions of people have their privacy stripped by politicians and business elites in the name of protecting our children, while those same politicians and business elites conspire en masse to prey on and sex-traffic our children. If these forces actually took those concerns seriously, rather than sensing them as an opportunity to push ulterior motives, they'd be eating each other alive right now. Half of DC, half of Hollywood, and at least a tenth of most major college administrations would ALL be at the docket.
by mrexcess
3/4/2026 at 2:16:05 PM
Tesla doesn't have parking sensors. They're a safety feature. There are lots of safety features in cars that are optional; we've got an entire rating system for the safety of cars.
We're talking about an app that's controlled by the CCP. I do expect them to take principled stances, stances like "Taiwan is a part of China" and "you can't be openly critical of the leader of the party". They don't have the same principles as you. You can force them to put in E2EE, but you can't force them to be honest about it or competent about it. I would rather know what we're getting than push them to lie.
This is the same thing as the OpenAI/Anthropic thing. You've got Anthropic taking a principled stance and getting pain for it, and you've got OpenAI claiming to take the same stance but somehow agreeing to the terms of the DoW. Do you think it's more likely that Anthropic carelessly caused themselves massive trouble, or that OpenAI is claiming to have won concessions that clearly won't work in practice? I think it's naive to think the former.
by Traster
3/4/2026 at 4:05:57 PM
> We're talking about an app that's controlled by the CCP, I do expect them to take a principled stance
In the area of large-scale internet service providers, who do you expect to take a principled stance, and why do you expect them to take it?
If the answer is, "nobody", then why keep singling out China? And if the answer isn't "nobody", then how do we apply the same pressures and principles to TikTok and other platforms that offer messaging?
This isn't some abstract concern. We know that WESTERN journalists, activists, and others have been murdered in acts of transnational repression that either began or were focused and abetted by communications surveillance aimed toward political dissidence. It seems incredibly naive to believe that current Western political and military leadership could ever be dissuaded from taking effective action (and such surveillance and repression campaigns certainly are effective) by moral qualms unsupported by strong checks and balances of accountability. In other words - this sort of repression most likely continues happening to journalists, activists, human rights lawyers, and other political dissidents, in our society, today. Enabled by the refusal of our service providers to protect us, their users.
It seems incredibly naive - civilization-threateningly so - to write a pass to anyone, let alone Larry Ellison, for opting to deliberately expose "his" users to this risk. Nothing is OK about this dereliction of responsibility towards them.
by mrexcess
3/4/2026 at 1:48:00 PM
Trying to gaslight the public into thinking end to end encryption makes users less safe is not fine.by dfxm12
3/4/2026 at 9:48:38 PM
That it’s fine because it’s the CCP (commies see all) is a new one.
It’s at best subpar for the same reasons as if it was the usual Silicon Valley spyware.
I could leave well enough alone. But why? Because there are choices? There are five other brands of cereal that do not have 25% sugar? I’d rather be a negative nancy towards these on-purpose addictive, privacy-leaking attention pimp apps.
by keybored