3/17/2026 at 8:53:25 PM
I feel like this is general knowledge for the past 5 or so years, but the real question is "What do we do about it?". Personally, I put real effort into not spending time being outraged online, but this is a societal ill that's bigger than I am...
by bigfishrunning
3/17/2026 at 9:13:33 PM
"What do we do about it?"

Shut down the behavior with regulations or shut down the companies. Meta and TikTok have no natural right to exist if they are a net negative to society.
by munk-a
3/17/2026 at 10:49:48 PM
Specifically, I believe Section 230 protections shouldn't apply to algorithmically promoted content. TikTok hosting my video isn't inherently an endorsement of what I'm saying, but proactively pushing that video to people is functionally equivalent even if you want to quibble over dictionary definitions. These algorithms take these platforms from dumb content-agnostic pipes that deserve protections to editorial enterprises that should bear responsibility for what they promote.
by slg
3/17/2026 at 11:39:59 PM
There is a decent legal argument to be made that §230 doesn't immunize platforms for the speech of their algorithm, to the extent that said speech is different from the speech of the underlying content. (A simple, if absurd, example of this would be if I ran a web forum and then created a highlight page of all of the defamatory comments people posted, then I'm probably liable for defamation.)

The problem of course is that it's difficult to disentangle the speech of algorithmic moderation from the speech of the content being moderated. And the minor issue that the vast majority of things people complain about is just plain First Amendment-protected speech, so it's not like the §230 protections actually matter as the content isn't illegal in the first place.
by jcranmer
3/18/2026 at 4:03:50 AM
I don't think we even need to go that far. Just remove protection for paid advertisements. It's absurd that Meta cannot be held liable for the ads they promote when a newspaper can be held liable if they were to publish the same ad.
by robhlt
3/18/2026 at 10:22:04 AM
But isn't this difficult when the tech bosses are in cahoots with the country bosses? And honestly even if the leadership changes, I somehow have a feeling the techs will naturally switch boats as well - might be a reason why the opposition doesn't paint them that much nowadays, to make sure they switch along.
by soco
3/18/2026 at 12:56:04 PM
They were all staunch Democrats with pro-censor stances until 14 months ago, and for a long long time.
by philipallstar
3/17/2026 at 10:51:07 PM
How would you square that with a site like Hacker News, which has algorithms for showing user-submitted links and user-generated comments?
by Aurornis
3/17/2026 at 10:56:14 PM
Listing content alphabetically or chronologically is technically an "algorithm" too. What I'm specifically challenging here is the personalized algorithm designed to keep individual users on the platform based off a user profile influenced by countless active and passive choices the user has made over time. The type of HN algorithm that serves the same content to every user based off global behavior is fine in my book because it is both less exploitative of the user base and a reflection of that user base's proactive decisions in upvoting/downvoting content.
by slg
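To make the distinction concrete, here is a minimal sketch of a "global" ranker in the HN style. The gravity formula below is the often-cited community approximation of HN's ranking (the real constants aren't public); the key point is that the inputs are only community votes and story age, so every user sees the same ordering and no per-user profile exists anywhere:

```python
def global_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    """Approximate HN-style global ranking: votes decayed by age.

    Note there is no user_id parameter; the ordering is identical
    for every visitor, which is the property being defended above.
    """
    return (points - 1) / (age_hours + 2) ** gravity

stories = [
    ("Old but popular", 500, 48.0),  # (title, points, age in hours)
    ("New and rising", 40, 1.0),
]
ranked = sorted(stories, key=lambda s: global_score(s[1], s[2]), reverse=True)
print([title for title, _, _ in ranked])  # the newer story outranks the older one
```

A personalized feed, by contrast, would take the viewer's identity and history as an input to the ranking function, which is exactly the line being drawn here.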
3/17/2026 at 11:14:35 PM
So if HN added anything personalized, like allowing you to show fewer stories on topics you dislike, it would lose protection? I can't get on board with that.

I also think it would be extremely unpopular. People like their recommendation engines. They want Netflix to show them more similar shows. They want Reddit to help them find more similar subreddits. I know there are HN users who don't want any of these recommendation engines, but on the whole people actually want them.
by Aurornis
3/17/2026 at 11:42:21 PM
>People like their recommendation engines.

People liked cigarettes too.
>They want Netflix to show them more similar shows.
Perhaps that example was a little too revealing on your end. Netflix doesn't have/need Section 230 protections and they're doing fine.
I'm not suggesting these algorithms should be illegal, just that Section 230 protections were defined too broadly because they predated the feasibility of these types of algorithms. These platforms would be free to continue algorithmic promotion, but I believe these algorithms would be less harmful if the platforms had to worry about potential legal liability.
Think YouTube and copyright for comparison. The DMCA is far from perfect, but we have YouTube as an example of a platform that survived and even thrived in the transition from a world that didn't care about copyrighted internet video to one in which they needed to moderate with copyright in mind.
by slg
3/17/2026 at 11:50:47 PM
> People liked cigarettes too.

Cigarettes weren’t made illegal. Cigarette companies are not liable for their users’ choice to consume them. What’s your point?
> Perhaps that example was a little too revealing on your end. Netflix doesn't have/need Section 230 protections and they're doing fine.
Perhaps it was a little too revealing on your end that you conveniently ignored my other example of Reddit.
If you need to cherry pick to make your point it doesn’t look very strong.
I still don’t see consistency in your argument that Section 230 should still apply to Hacker News but not, for example, Reddit, simply because one of them allows users to personalize the content they see.
by Aurornis
3/18/2026 at 1:41:47 AM
> Cigarette companies are not liable for their users’ choice to consume them.

They kind of were. Not completely liable, but partially. Because... um, well, uh, yeah, they are. They are literally liable.
If you produce cigarettes, you are partially responsible for people smoking. Smoking is also not a "choice", come on now. The only people who believe that are people trying to sell you cigarettes or people who have never smoked.
That's why you can't advertise cigarettes anywhere anymore and they're very hard to find. And, when you do find them, the box tells you "hey please don't smoke this". R.J. Reynolds didn't do that by fucking choice, we forced them.
by array_key_first
3/18/2026 at 1:46:08 AM
> They kind of were. Not completely liable, but partially. Because... um, well, uh, yeah, they are. They are literally liable.

Cigarette companies are not legally liable for the consequences their users encounter.
It’s really hard to have an actual discussion about anything when people are just making up their own definitions.
by Aurornis
3/18/2026 at 1:53:24 AM
Cigarette companies paid billions, and continue to pay, for the societal harm they cause. That's a liability. They're not legally liable in the sense that nobody is going to jail. But they have financial liabilities. Because they do, literally, cause financial harm.

I don't think people really understand just how harshly we ran Tobacco companies into the ground. Many pay more per cigarette for liability than they pay to make the cigarette.
by array_key_first
3/18/2026 at 5:04:23 AM
In the narrow definition of the term you are using, cigarette companies were found legally liable.

The whole reason they got sued and regulated was because they hid the fact that they knew their product was causing cancer in its users.
There’s additional regulation on cigarettes, which also includes higher taxes on its sale.
We regularly put limits on industries which create externalities that have to be borne by the exchequer.
by intended
3/18/2026 at 3:36:54 AM
> Cigarette companies are not legally liable for the consequences their users encounter.

Ok! But they do have to follow a bunch of extra laws that cost them a ton of money and/or users.
Therefore the same can apply to social media algorithm companies.
One extreme example: just like with cigarettes, there could be 18+ age verification for social media. That's a big deal.
by stale2002
3/18/2026 at 12:01:07 AM
This is the type of comment that suggests you aren't engaging with what I'm saying beyond a superficial level. My argument is consistent. I'm not cherry-picking examples. The differentiator I'm criticizing is the personalized nature of the algorithms. But rather than engaging with the merit of that distinction, you're acting as if there is no distinction at all. I'm not sure if there is much point in continuing the conversation from there.
by slg
3/18/2026 at 12:14:00 AM
I think the other person's issue with your position is that the distinction is entirely arbitrary. You're not giving any reasons why the demarcation line for which feed algorithms are OK and which are not is there instead of anywhere else. It seems to be just "Facebook and TikTok are bad; their feeds are personalized recommendation engines; therefore personalized recommendation engines are bad, and other feed algorithms are OK".
by fluoridation
3/18/2026 at 12:40:45 AM
>I think the other person's issue with your position is that the distinction is entirely arbitrary.

Basically all laws related to speech are arbitrary. Can you define a clear and self-evident line between pornography and art as an example? Or do you agree with the Supreme Court that we just "know it when [we] see it"?
>You're not giving any reasons why the demarcation line for which feed algorithms are OK and which are not is there instead of anywhere else.
Let me just copy and paste what I said before: "The type of HN algorithm that serves the same content to every user based off global behavior is fine in my book because it is both less exploitative of the user base and a reflection of that user base's proactive decisions in upvoting/downvoting content." I can understand if one of you wants to challenge that line of thought, but both of you acting like I didn't give any reasoning at all is bizarre and gives me the impression that you aren't actually reading what I'm writing.
by slg
3/18/2026 at 3:40:28 AM
> Basically all laws related to speech are arbitrary.

True. This is a fair point. But the expected counter argument would be that the exact line isn't the issue; instead, it's the justification for the principle.
I.e., why are personalized algorithms more dangerous than general ones?
My answer (because I mostly agree with you) is that the difference is that personalized algorithms almost feel like brain hacking. And this brain hacking simply doesn't work at scale when applied to vague general algorithms.
by stale2002
3/18/2026 at 12:57:54 AM
>Basically all laws related to speech are arbitrary. Can you define a clear and self-evident line between pornography and art as an example? Or do you agree with the Supreme Court that we just "know it when [we] see it"?

I'm a free speech absolutist, so I personally don't find which laws already exist on the matter to be a compelling argument. If it was up to me, I'd get rid of any such laws.
>The type of HN algorithm that serves the same content to every user based off global behavior is fine in my book because it is both less exploitative of the user base and a reflection of that user base's proactive decisions in upvoting/downvoting content.
The argument hinges entirely on the relative exploitativeness of different feed algorithms, but that metric is merely asserted with no support.
by fluoridation
3/18/2026 at 1:23:46 AM
>I'm a free speech absolutist

Typically free speech absolutism leads individuals into logical traps they find difficult to dig themselves out of.
But we don't even need that in this case. Private property can have all kinds of restrictions put on it based on the potential dangers and harms it causes. In fact, this is one of the most common attacks on speech I see right now against Meta et al.: that they will just put age requirements on sites.
by pixl97
3/18/2026 at 1:37:51 AM
>Typically free speech absolutism leads individuals into logical traps they find difficult to dig themselves out of.

Yes, "free speech absolutists" tend to define these terms in ways to hide the true arbitrary nature of their beliefs. The obvious test case is do they believe in legalizing CSAM. Either they answer "yes" and ostracize themselves from almost all of society or they say "no" and have to come up with arbitrary rules why this specific content doesn't count as speech. Either way, self-applying the label is its own red flag.
by slg
3/18/2026 at 2:08:30 AM
I don't really understand what your point is.
by fluoridation
3/18/2026 at 3:02:04 AM
If I understand the point correctly, it's that regulating the algorithms of Meta et al does not curtail your free speech, so it's a moot argument.
by anonymars
3/18/2026 at 3:29:26 AM
I wasn't the one who brought up free speech into the discussion; slg was. That aside, whether it curtails it or not would depend on how one defines "speech". Even if the particular way in which a website displays information is not speech, I still think it would be an overreach for a government to legislate how websites are allowed to function. If I as a user want to see a feed populated by recommended content, and the site's operators want to show it to me, what business does the government have stepping into our interaction?
by fluoridation
3/18/2026 at 11:57:27 AM
Cigarettes and their externalities are analogous, and that's discussed over here: https://news.ycombinator.com/item?id=47419870
I don't believe the argument was that personalized algorithmic recommendations need be forbidden per se, but that doesn't mean that should be the default, nor that companies should be able to wash their hands (under section 230) of what they promote
by anonymars
3/18/2026 at 4:54:04 PM
Like the other person said, cigarettes are not illegal. Are we really going to pretend that whatever harm TikTok causes is comparable to lung cancer?
by fluoridation
3/18/2026 at 6:37:44 PM
Like the other posts you're arguing against have said, the argument is not that social media or personalized algorithms should be "illegal".

And "are we going to pretend" is a non-argument that works both ways: "Are we really going to pretend individualized algorithmic social media hasn't caused harm to society on par with smoking?" would be equally unconvincing.
by anonymars
3/18/2026 at 9:19:16 PM
There's no pretending there. It just hasn't.
by fluoridation
3/18/2026 at 1:58:38 AM
What do you think about the case of Lucy Connolly, who, during a riot where rioters were burning down hotels housing immigrants, tweeted that people should burn down hotels housing immigrants and was arrested for that?
by direwolf20
3/18/2026 at 2:09:00 AM
I already stated what my position is. Why do you need to ask about specific cases? Are you trying to look for gotchas?
by fluoridation
3/18/2026 at 1:57:20 AM
Of course Section 230 would apply to both sites, but only to the user-generated part of each site, because that's what Section 230 says.
by direwolf20
3/17/2026 at 11:40:22 PM
That is not comparable because of how little control you have over the algorithm in the other cases. On Bandcamp, you can select the genre and a sorting criterion and have very good control over the list. But on Spotify, it’s very obscure, with things you’ve never asked for appearing in front even before your own library.
by skydhash
3/18/2026 at 2:56:33 PM
For me, the distinction is control. If I'm filtering out things I don't like, I'm in control. If the system is filtering out items or promoting items, I think it fair it take on more responsibility.

A system doesn't keep your feed full because it wants your eyes; it does it for money. When they choose what goes into the feed, they should gain increased liability for what comes out. That's the risk they take on for more money. If that money is not worth it, don't recommend.
I enjoyed the internet in the beforetimes. Recommendations were limited to "this is objectively related, this is new, this is upvoted, this is by someone you follow or someone they follow, or this is randomly chosen." I still feel there is some liability there, but it is less than when it changes to "this is something we have determined we should show you based on your personal past behavior." That feels different than liking a category when the meta-categories are picked for you. Especially when those meta-categories allow for things you would not want to opt in to, like doomscroll material.
I like some of the stuff I get algorithmically. I never would have searched for a soul cover of Slim Shady, but I'm glad I found it. And I'm glad I found knot tying videos. I think there is space for fancy feeds. But I think it should come closer to being a publisher. This _will_ depress throughput creation if things all have to be monitored which will change the economies and maybe that means some businesses can't exist as they do today. I'd likely pay a subscription to a LearnTok that had curated, quality material.
by sethammons
3/18/2026 at 10:05:13 AM
1.) I do not know anyone who would particularly like the Netflix recommendation algorithm.

2.) The Netflix algorithm is not relevant to "Section 230 protections", because it does not contain any data from third parties. All of that is Netflix content.
by watwut
3/17/2026 at 11:39:59 PM
I'm paying for Netflix to do that as a feature. Instagram uses that to drive engagement to sell ads. Disabling personalized content on Netflix is a revenue-neutral choice. On Instagram, that would mean their ad revenue takes a huge dive. Apples aren't oranges.
by levkk
3/17/2026 at 11:43:50 PM
Netflix does it to drive engagement as well.by krapp
3/18/2026 at 5:00:12 AM
I can get on board with it for sure.

There's a paper that studied the spread of misinformation online, back before COVID. They found that messages cascaded through more science- and research-oriented networks differently than they cascaded through conspiracy communities.
Popularity is not a sign of Signal. It’s a sign of being able to scratch the limbic system and zeitgeist at the same time.
For a site like HN, popularity isn’t a good predictive signal.
by intended
3/17/2026 at 11:42:18 PM
But algorithmic feeds can actually be useful for discovery of related material - I want Youtube to show me more Japanese jazz and video essays about true crime based on my watch history, I wanted Twitter to show me more accounts from writers and game developers because I follow them (before the platform went full Nazi) and I like that Facebook shows me people and information from my local area. Forcing all platforms to use only alphabetical or chronological feeds because of the exploitative way some platforms use algorithms seems awfully close to the "banning math" argument people used to use about cryptography and DRM, and it would remove a lot of legitimate use from the internet.
by krapp
3/18/2026 at 12:35:23 AM
It's all about who controls the algorithm. A sensible approach would be to decouple recommendations from platforms, to treat them like plug-ins that the user must be allowed to add or disable. You want to use YouTube's recommendation algorithm on YouTube? Great, but there needs to be an off-switch and a way to change over to another provider. This is classic anti-trust stuff, breaking up a sector into interoperable pieces.
by idle_zealot
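As a toy sketch of that decoupling idea (every name here is hypothetical, invented for illustration): the platform exposes items in a stable shape, ranking is a swappable component chosen by the user, and a non-personalized "off switch" is always available:

```python
from typing import Callable

Item = dict  # e.g. {"id": ..., "ts": ..., "score": ...}
Ranker = Callable[[list[Item]], list[Item]]

def chronological(items: list[Item]) -> list[Item]:
    # The mandatory "off switch": newest first, no profiling involved.
    return sorted(items, key=lambda i: i["ts"], reverse=True)

def platform_default(items: list[Item]) -> list[Item]:
    # Stand-in for the platform's own engagement-driven ranker.
    return sorted(items, key=lambda i: i["score"], reverse=True)

# User-selectable registry; under this scheme, third-party providers
# could register their own rankers here.
rankers: dict[str, Ranker] = {
    "off": chronological,
    "default": platform_default,
}

feed = [{"id": 1, "ts": 10, "score": 0.9}, {"id": 2, "ts": 20, "score": 0.1}]
print([i["id"] for i in rankers["off"](feed)])      # chronological order
print([i["id"] for i in rankers["default"](feed)])  # engagement order
```

The interoperability question is then about standardizing the `Item`/`Ranker` boundary across platforms, which is where the real difficulty would lie.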
3/18/2026 at 10:13:30 PM
The anti-trust argument doesn't work for me. Neither Youtube nor any other single platform represents a "sector" in the way Standard Oil or Ma Bell represented a "sector"; they don't "control the algorithm" in any sense beyond implementing code on their site. Certainly not in the way a monopoly prevents other entities from competing against it by controlling access to some physical resource. Other video hosting sites besides Youtube exist, other social media platforms exist, so competition exists.

And besides, what's likely to happen is that you'll only have a few "algorithm providers" controlling access to the entire web, which only centralizes it even more.
by krapp
3/17/2026 at 11:32:02 PM
Really nice to see someone else bringing this up. Algorithmic editorial decisions are still editorial decisions. I think ultimately search and other forms of selective content surfacing should not have ever been exempt. They were never carriers. I appreciate that this would make the web as we know it unusable. I think failing to tackle this problem will also make the web unusable, and in a worse way.
by deeponey
3/17/2026 at 11:58:01 PM
> I think ultimately search and other forms of selective content surfacing should not have ever been exempt. They were never carriers. I appreciate that this would make the web as we know it unusable

I can’t be the only one confused at these calls to have the government destroy things like searching the web, am I?
How is this a real idea being proposed on Hacker News, of all places? Not that long ago it was all about freedom on the Internet and getting angry when the government interfered with our right to speech online, and now there are calls to do drastic measures like make search engines legally untenable to run in the United States?
It’s also confusing that nobody calling for banning things or making the web unusable appears to be making the connection that the internet is global. If we passed laws that forced Google and Bing to shut down because they’re liable for results they index, what do you think the population will do? Shrug their shoulders and give up on the internet? Or go use a search engine from another country?
by Aurornis
3/18/2026 at 12:30:37 AM
> How is this a real idea being proposed on Hacker News, of all places? Not that long ago it was all about freedom on the Internet and getting angry when the government interfered with our right to speech online

I can be upset about the government trying to make the world worse, and about other huge balls of power who have been making the world shitty in an ongoing fashion. Freedom of speech doesn't mean shit if a handful of people can buy up or otherwise absorb control of 90% of media and choose who gets heard. The call for regulation is an acknowledgment that the market fucked this one up. When the government threatens speech, I'll call for civil disobedience and proactive protections. When oligarchs threaten speech I'll call for regulation and punishment.
> It’s also confusing that nobody calling for banning things or making the web unusable appears to be making the connection that the internet is global. If we passed laws that forced Google and Bing to shut down because they’re liable for results they index, what do you think the population will do?
You assume that the only way to get a good, free search engine is to give control of it to some private entity. That if we don't do it in the US, people will turn to someplace else. I think you may be lacking in imagination. At a minimum, the possibility exists for nonprofit organizations to run quality search engines, but it's also possible to decouple the indexing business from the ranking provider. Google could run an index and charge for access, and ranking providers could build on top of that and recoup costs with non-tracking ads, donations, sales, whatever business model they please. Just because an unregulated market doesn't come up with a good solution doesn't mean a market under different constraints won't find a better way. And if nothing works out you always have the option of grants or a public digital infrastructure approach. There are so many things to try beyond shrugging and declaring that the market has ordained five dudes arbiters of the internet as experienced by most people.
by idle_zealot
3/18/2026 at 12:53:18 AM
> I can’t be the only one confused at these calls to have the government destroy things like searching the web, am I?

if you find this distressing, then i imagine you find it equally distressing when a couple of corporations destroy something.
the reason the word "enshittification" has become so ubiquitous is because corporations are actively destroying the internet and desperately trying to convince us the internet is separate from "the real world".
sometimes stopping a person from burning the house down is necessary. no matter how loudly they cry about their freedom to have a bonfire in the living room.
by toofy
3/18/2026 at 1:46:53 PM
What we need is quite simply a very good protocol for distributed search. It takes some storage, some bandwidth, and some CPU cycles. Have people contribute those and earn queries and indexing. Make it very good but simple enough for a half-decent programmer to make a level 1 node that can only announce it exists. Trackers, super nodes, ban lists, ranking algos, etc. Write server code in all the languages; have phone and desktop clients. There can be subscription-based clients too, so that the CPU, storage, and bandwidth can be handled for you by a company.

This description is intentionally vague.
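The description above is deliberately vague, but the "level 1 node that can only announce it exists" part is small enough to sketch. Every concrete detail below (message shape, transport, JSON encoding) is invented purely for illustration, not a proposal for the actual protocol:

```python
import json
import socket

# Toy "level 1" node for a distributed-search network: the only thing
# it can do is announce its existence to a tracker.

def make_announcement(node_id: str) -> bytes:
    # Hypothetical wire format; a real protocol would specify this.
    return json.dumps({"type": "announce", "node": node_id, "proto": 1}).encode()

# Local loopback demo: one UDP socket plays the tracker, another the node.
tracker = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tracker.bind(("127.0.0.1", 0))  # OS-assigned port
node = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
node.sendto(make_announcement("node-1"), tracker.getsockname())

data, _ = tracker.recvfrom(4096)
received = json.loads(data)
print(received["node"])  # the tracker now knows node-1 exists
tracker.close()
node.close()
```

Everything above level 1 (query routing, index sharding, ranking, ban lists) is where the hard design work would actually be.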
by 6510
3/18/2026 at 12:55:14 PM
This seems the same as news organisations choosing which news to report on, but driven by user behaviour rather than the org's employees themselves.
by philipallstar
3/17/2026 at 10:43:29 PM
oddly enough the TikTok referred to here was to be shut down in the US. But then the executive branch ignored the law while it could organize handing the company over to Larry Ellison instead. But these allegations date to when the company was fully under the control of ByteDance, and not US-regulated entities at all.
by zzzeek
3/17/2026 at 10:47:31 PM
> oddly enough the TikTok referred to here was to be shut down in the US. But then the executive branch ignored the law while it could organize handing the company over to Larry Ellison instead

Which should make people think twice when they call for government regulation on speech as a solution to content they don't want other people to see.
The more you give the government power to control speech, the more they'll use those laws to further their own interests.
by Aurornis
3/17/2026 at 11:49:09 PM
Wouldn’t we need to shut down all news outlets, all the twitters and all the newspapers then? They might not be at the same point on the toxic spectrum as Meta/TikTok, but are very close.
by cryptoegorophy
3/18/2026 at 12:00:14 AM
There are people in this thread directly calling for us to strip protections from search engines and force them to shut down.

I think a lot of this discussion has become detached from reality and we’re just entertaining some people’s impossible fantasies about shutting down the internet and returning to the past.
by Aurornis
3/18/2026 at 12:44:13 AM
Human instinct is always to ban and fight everything as soon as any change happens in society. The same biological motivation to doomscroll fuels our instincts to panic and doompost about how society is ruined unless we do [brash action].
by dmix
3/18/2026 at 2:36:37 AM
Then we'll just use the Chinese apps. Or do you plan on shutting down our access to Chinese apps too?
by ronnier
3/18/2026 at 3:05:13 AM
Like TikTok?by anonymars
3/18/2026 at 3:54:09 AM
"What do we do about it?"

Account --> Delete
by rkagerer
3/17/2026 at 11:50:26 PM
>> Meta and TikTok have no natural right to exist if they are a net negative to society.

Exactly. And when we are done with them we will shut down Molson and Anheuser-Busch. Then we can go after the people who make selfie sticks. Then the company that owns that truck that cut me off last week. Basically, organizations I dislike should not be allowed to exist.
by sandworm101
3/17/2026 at 9:56:28 PM
Regulating content that makes people enraged seems like a slippery slope towards regulating any kind of "unwanted" speech. I get regulating CSAM, calls for violence or really obvious bullying (serious ones like "kill yourself" to a kid), but regulating algorithms that show rage bait leaves a lot of judgement to the regulators. Obviously I don't trust TikTok or Meta at all, but I don't trust the current or the future governments with this much power.

For example, some teen got radicalized with racist and sexist content. That's bad in my opinion, as I'm not a racist or a sexist. But should racist or sexist speech be censored or regulated? On what grounds? How do we know other unpopular (now or in the future) speech won't be censored or regulated in the future? Again, as much as I'm not a racist or sexist, I don't think the government should have a say in whether a company should be able to promote speech like "whites/blacks are X" or "men/women are Y". What's next? Should we regulate speech about religion (Christians/Muslims/atheists are Z) or ethics (anti-war people or vegans are Q) or politics or drugs or sex?
The current situation is shitty, but giving too much power to regulators will likely make it way shittier. If not now, in the future, since passed regulations are rarely removed.
by diacritical
3/17/2026 at 10:42:01 PM
At least in the US the government can't regulate speech (for the most part). But what we could do is regulate recommendation algorithms or other aspects of the overall design in a way that's generalized enough to be neutral in regards to any particular speech. And such regulations don't need to apply to any entity below some MAU or other metric.

Even just mandating interoperability would likely do, since that would open up the floor to competitors. Many users are well aware of the issues but don't feel they have a viable alternative that satisfies their goals.
by fc417fc802
3/17/2026 at 10:51:41 PM
In theory I'm OK (kinda) with regulating the "overall design" somehow, but I don't see how it's going to work. Forced interoperability is a (very?) good idea, as it's really general, but it also doesn't address directly what the article and most comments talk about - the rage bait. I just can't imagine regulations (or "laws" or whatever the correct term is) that deal specifically with the algos that push rage bait that can't later be abused, if passed, to deal with other unpopular speech. And it seems like people want some laws to directly deal with that - the bad types of speech or algos themselves.

To clarify, I use "rage bait" as an example phrase, but it includes algos that only promote engagement at any cost and other things that aren't outright dangerous but that we think are dangerous. Not, like I said, CSAM or yelling FIRE or telling people to kill themselves.
by diacritical
3/18/2026 at 12:02:41 AM
Interoperability sidesteps the issue by giving users the choice of which algorithm (or algorithm provider) to use. The majority might or might not agree with that approach - for example, obviously tobacco has not been left purely to the individual's judgment in the west.

Agreed, you can't regulate speech in a targeted manner while also not doing so. You're forced to find some common aspect much more general than "rage bait". Perhaps prohibiting the targeting of certain metrics? Or even prohibiting their collection in the first place.
by fc417fc802
3/18/2026 at 12:26:15 AM
> You're forced to find some common aspect much more general than "rage bait". Perhaps prohibiting the targeting of certain metrics? Or even prohibiting their collection in the first place.

Can you elaborate, give some ideas, examples, etc.? What metrics? How can you define them in a consistent, safe way?
by diacritical
3/18/2026 at 12:56:33 AM
We're talking generalized metrics. I have no idea which ones - I wasn't claiming to have solved the problem. The point is that if you can identify a general characteristic that is being used in a way which disproportionately contributes to a particular outcome, then you can filter on that.

Estimated user age is an example of a metric largely unrelated to concerns regarding free speech. I doubt it has much relevance to the problem we're talking about here, but hopefully you can imagine that prohibiting the targeting of ads or the curation of an algorithmic feed based on that metric would not be expected to unduly disadvantage any particular sort of speech.
by fc417fc802
3/18/2026 at 1:08:42 AM
> The point is that if you can identify a general characteristic that is being used in a way which disproportionately contributes to a particular outcome then you can filter on that.

In a non-adversarial political context where we trust the government and the future ones, sure, but I think without any strong guardrails, we could enact a law that's good today, but will be exploited in the future.
For targeting minors with any kind of political speech - I'd love it if it wasn't legal. But that brings its own can of worms. There's enough discussion on HN on the implications of age verification, whether on how it's done technically (privacy-preserving or not (ZKP or just shady 3rd parties); FOSS or not; on the ISP, OS or app level, etc.) and whether the mere precedent could trigger additional issues down the road.
Anyway, I'd love a society where everything is perfect, but I'm afraid of what might actually happen. With a benevolent god as a permanent ruler, I'd be happy with a 100% prosecution rate against all kinds of littering, hate speech and whatnot, but in reality random crimes are easier to evade than a law passed down by a malevolent government, so I'm strongly against any kind of overreach. (Because the law tomorrow could be one we must evade if we want to resist an unethical government.) Someone will likely chime in with "but complete and massive overreach has never happened so far", to which I'd reply - we're close to the point where technology will let the ones in power grab that power absolutely and forever if we let them grab too much in the beginning.
by diacritical
3/18/2026 at 4:37:22 AM
> In a non-adversarial political context

An oxymoron.
> where we trust the government and the future ones
Has never and will never exist.
> we could enact a law that's good today, but will be exploited in the future.
Sure but that's how pretty much all legislation works.
> I'd love it if it wasn't legal. But that brings its own can of worms.
It's probably fine as long as you include the clause "knowingly and intentionally". That doesn't imply age verification or anything else, merely that you act on information that you have and are aware of (and that you not intentionally design systems to work around that).
Also note that I never said anything about underage users. My example was targeting based on estimated user age. So in that example the age is estimated and it is illegal to target anything based on the value. (Of course to avoid a very silly loophole you'd also need to disallow targeting based on verified age as well.)
by fc417fc802
3/17/2026 at 11:21:51 PM
> I get regulating CSAM, calls for violence or really obvious bullying (serious ones like "kill yourself" to a kid)

I've reported videos that look like sexual exploitation, videos that call for violence and videos that promote hate (xyz people are cockroaches), and all I've gotten is that "it does not go against community guidelines" with a link to block the person who created them. So any concerns of "where do we draw the line" are in my opinion pointless, because the bare minimum isn't even being done.
by darth_avocado
3/17/2026 at 11:34:35 PM
I agree with your CSAM and explicit calls for violence examples - they probably should be regulated. But a few comments ago in another thread, someone didn't like me calling people in the workplace who annoy me with their mindless chit chat "corporate drones". My post could be construed as promoting hate. Where do we draw the line from "cockroaches" to "drones"? Do I have to call a certain "protected class" drones for it to qualify as hate speech?

What if I didn't say anything bad about a race or a sex, but said:
> I have coworkers that pester me with their small talk about the weather every time I see them. I hate those fucking cockroaches.
That's in bad taste, sure, but should it be regulated? You may know I obviously don't hate-hate them (they're just annoying, but most of them are good people) or actually consider them cockroach-like in any meaningful aspect (they're obviously people, but with annoying tendencies). But would a regulator know the difference? What about a malicious regulator who gets paid by (ok, this is a silly example, but bear with me) the weather-talking coworker lobby to censor me? In many not-so-silly examples a regulator could silence anyone for anything (politics, sex, drugs, ethics), as long as it uses a bad word or says anything negative about anyone. I don't want to live in such a society. That much power would be abused sooner or later.
by diacritical
3/17/2026 at 11:07:20 PM
I'm sorry, but are you saying it's hard to figure out what to do, so let's do nothing? Banning racist and sexist content is not a slippery slope. It's just banning racist and sexist content; the slope is only slippery because the salivating mouths of these social platforms grease it.

Also, I don't think people are advocating censorship; they are advocating not promoting assholes. You can have your little blog and be racist on it all you want, but let's not give these people the equivalent of nukes for communication.
by newswasboring
3/17/2026 at 11:23:22 PM
> are you saying it's hard to figure out what to do so let's do nothing?

I'm fine with doing something, but the current "something" seems slippery.
> Banning racist and sexist content is not a slippery slope. It's just banning racist and sexist content, slope is only slippery because the salivating mouths of these social platforms grease them.
But what is "racist", exactly? See why I think it's a slippery slope and why it's ill-defined:
1. We could agree that "Let's go out and kill/enslave all the $race/$gender" is racist, but that's bad if we switch $race to any group, as it's speech that incites violence.
2. What about "$race is genetically inferior in a way (less intelligent, less athletic, more prone to $bad_behavior)"? I honestly think most differences between races/ethnicities are due to environmental factors, but what if there actually are differences in intelligence or anything like that? Should we ban speech that discusses that? Black people win running races and are great at basketball. They're prone to certain diseases, as are Caucasians or Asians. So would you ban discussing that? Or would you ban blindly asserting that $race is $Y without some sort of proof?
3. What about statements like "There are way more male bus drivers because X"? Or "men are better at Y, but women are better at Z"?
What do you think the definition of racism and sexism in this context should be? I think the line is inciting violence towards a group, not discussing differences that may or may not be true.
> Also, I don't think people are advocating censorship, they are advocating not promoting assholes. You can have your little blog and be racist on it all you want, but let's not give these people equivalent of nukes for communication.
I think restricting a platform (or anyone or anything) from promoting someone IS censorship. If it's not censored, why shouldn't I be able to promote it? This honestly feels disingenuous - like "we pretend that the racist isn't censored and can have his little blog, but it's illegal to promote his little blog".
by diacritical
3/18/2026 at 10:42:42 AM
It's easy, let's start with banning 1. Obvious incitement of violence. If they can enforce just that much, it would be great.

by newswasboring
3/17/2026 at 11:17:44 PM
> I'm sorry but are you saying it's hard to figure out what to do so let's do nothing?

That seems more reasonable than the alternative, which is to make modifications to a complex system without being sure what the outcome will be. You're more likely to cause bigger problems.
by slopinthebag
3/17/2026 at 9:40:33 PM
Regulation will never happen because these are instruments to control the masses.

by bdangubic
3/17/2026 at 10:19:49 PM
All the more reason for regulation. If people catch on to the fact that they are being manipulated and abused by the platforms to "drive engagement", they might abandon them or spend less time on them. If the government regulates these platforms so that they are safer, or at least less harmful, people will feel better about using them, giving the government a larger platform to use to control the masses.

by autoexec
3/17/2026 at 10:54:11 PM
> If people catch on to the fact that they are being manipulated and abused by the platforms

I am not trying to be funny or anything, but this sounds like "if only the fat kid realized that eating 10 apple pies before bedtime might be the reason s/he is fat". We already know what social media platforms are doing, not just to young people but to all people.
> If the government regulates these platforms
This is like saying "congressmen care about our debt so they will vote to reduce their own salaries by 90%" - the government is not going to regulate the tools they are using to control the narrative/masses etc...
by bdangubic
3/18/2026 at 12:05:45 AM
Tax and heavily regulate online advertising. The root of the problem is that it is very, very lucrative to drive engagement, and until you get rid of the monetary incentive, the problem will never go away.

by ThrowawayR2
3/17/2026 at 9:00:20 PM
"Make the drug less good" likely isn't the answer. Nor is banning it.What caused Gen Z to drink less than millenials? Maybe Gen Z has the answer.
by cj
3/17/2026 at 9:07:39 PM
You're only allowed to drink as an adult. We're talking about letting those companies rot our brains in those first 18 years.

by barbazoo
3/17/2026 at 9:50:31 PM
In my experience, the 60+ demographic has had far more damage done.

by XorNot
3/17/2026 at 10:21:13 PM
We just haven't seen what 60-year-old iPad kids look like yet. It's not going to be pretty.

by autoexec
3/18/2026 at 9:02:39 AM
We can see what Trump is like, as an 80-year-old radicalized by Twitter with an army at his command.

by pjc50
3/18/2026 at 2:41:09 AM
Just as they were settling into middle age, far-right propaganda, conspiracy, and hate "entertainment" escaped AM radio and flooded cable news and social media.

They never stood a chance.
by Rapzid
3/17/2026 at 9:07:29 PM
Yeah, it's called "smoking weed".

by SirFatty
3/17/2026 at 9:39:33 PM
Technology, culture, legalization of pot, adtech, COVID - there are a metric ton of factors that all had a significant impact on decreasing socialization and reducing drinking. And on lowering the birth rates, the number of healthy relationships, healthy friendships, etc.

I'm for legalizing all drugs, regulating the sale, ensuring quality and purity, and educating the public. Cognitive liberty is sacred - but the dip in drinking has a whole lot of causes.
A healthier society would be more social and get out and drink more, I think.
by observationist
3/18/2026 at 1:13:39 AM
Millennials love their weed, party drugs too; in some ways it took over from Gen X drinking.

But I find Zoomers to be rather tame in terms of drinking, smoking, drugs, unsafe sex, etc... Few of the traditional vices, really.
by GuB-42
3/17/2026 at 11:19:13 PM
Inflation, mostly. And a lot of us lack social skills, so we don't have many friends and thus no reason to go out and get drunk.

But like, when a pint is $12 and mixed drinks are $15+, sobriety starts looking more appealing.
Source: Am gen Z.
by slopinthebag
3/17/2026 at 9:28:45 PM
Decades of science communication and real-life examples of knowing (of) alcohol addicts.

by jqbd
3/17/2026 at 9:38:09 PM
I'd wager that how expensive it has gotten, plus a year or two of lockdowns which led to a whole generation of people not going out to get wasted as soon as they're legally allowed to, had way more effect.

Oh, and weed being increasingly legal to consume.
by input_sh
3/17/2026 at 11:29:55 PM
I also noticed a trend that happened at my old college and a number of others that I've never seen anyone write about: the great buyout of the old college-area slumlords.

All the dive bars where you could black out off $10-20, which I drank at in college, are gone. They all faced the wrecking ball and were replaced in the past 10-15 years with apartments over Targets and CVSs and family-friendly restaurants. A huge concerted effort to buy up these properties piecemeal, then destroy entire blocks at a time. I have no clue where kids at my college go to drink now. I have little interest in going back as an alumnus either, as they destroyed all the places of my memories.
by asdff
3/17/2026 at 9:42:40 PM
Real-life experience with alcoholics would at best be constant over time, or be diminishing (since Gen Z drinks less).

Also, it seems like the science on whether science communication actually changes behavior doesn't point towards it being much of a cause here.
by aerodexis
3/18/2026 at 1:51:34 AM
> What caused Gen Z to drink less than millennials?

Social media addiction?
by dehrmann
3/18/2026 at 4:47:23 AM
As one of said generation, I would chalk it up to instant communication creating innumerable shallow remote relationships that significantly replace time spent with others in person.

by pixelmelt
3/17/2026 at 11:23:48 PM
Gen Z drinks less because alcohol isn't enough of a fix and hard drugs are way cheaper. The answer isn't what you're looking for.

by darth_avocado
3/18/2026 at 4:36:59 PM
> What caused Gen Z to drink less than millennials?

Money.
by linhns
3/17/2026 at 9:21:25 PM
Make it legal and expensive?

by fakedang
3/18/2026 at 2:03:41 AM
Gen Z never touch grass; you need to first leave the house before drinking comes up.

by bdangubic
3/17/2026 at 9:00:23 PM
It’s like asking how do you get people to stop drinking alcoholAs long as there are people who don’t acknowledge or care about the health effects it will exist. If that’s a plurality of your population then you have a fundamental population problem IF you are in the group who thinks it’s bad.
Aka every minority-majority split on every issue ever.
So the answer is: live in a society governed by science. Unfortunately none exist
by AndrewKemendo
3/17/2026 at 10:17:24 PM
> So the answer is: live in a society governed by science. Unfortunately none exist

Science is a lagging indicator of reality. It is by definition conservative (in that it requires rigorous, repeatable data before it can label something as true). Because of that, there's usually a pretty substantial gap between human discovery and scientific consensus.
Mindfulness was discovered, as an example, to be beneficial as far back as 500 BCE. It wasn't "proven" with science until 1979.
Sometimes we just need to rely on lived experience to make important decisions, especially regulation. We can't always wait for science.
by thewebguyd
3/17/2026 at 11:54:05 PM
> Science is a lagging indicator of reality

Tell me what the leading indicator of reality is, then.
by AndrewKemendo
3/18/2026 at 5:09:35 AM
Tongue in cheek: putting your finger in the socket 30 times before realizing it's a bad idea yourself.

Harms are a leading indicator of the limits of current reality.
by intended
3/18/2026 at 11:39:33 AM
Hypothesis testing with real-world feedback… Sounds like science to me.

by AndrewKemendo
3/17/2026 at 10:09:54 PM
I drink, but I acknowledge and care about the health effects. I care more about how it makes me feel. Don't assume everyone who smokes or drinks alcohol or takes another type of drug just doesn't care. Why don't we ban dangerous sports like rock climbing or BASE jumping or MMA while we're at it?

by diacritical
3/17/2026 at 9:03:04 PM
We handled smoking pretty well by making it cost more and banning it in public places. If TikTok were banned from official app stores, it would essentially go away.

by nemomarx
3/17/2026 at 9:10:13 PM
Social media addiction is much deeper than nicotine addiction. And people still smoke - see Philip Morris stock and earnings :)

by bdangubic
3/17/2026 at 10:00:55 PM
I don't think deeper is the right word. Nicotine has a physical addiction element that social media does not. You cut off social media, you at worst face some boredom and FOMO.

And PM's earnings are mostly from developing countries at this point. In the US alone, the adult smoking rate has fallen nearly 73% from 1965 to now, so clearly the regulations are working.
We need to do the same for social media. People didn't quit smoking because they suddenly got more disciplined. We just made it inconvenient. The biggest start would be get rid of algorithmic feeds and "recommendations" keep it purely chronological, only from people you explicitly follow.
by thewebguyd
3/17/2026 at 10:14:26 PM
Nitpicking maybe, but nicotine isn't the main thing that makes cigarettes addictive, and it's not that bad by itself. Gwern has a long article on nicotine that's worth a read [0].

More importantly, why do you think society should make smoking inconvenient - more costly, more illegal or anything like that? If I'm not blowing smoke in your face, why interfere with my desire to smoke? If it's about medical bills, just let me sign a waiver that I won't get cancer treatments or whatever, and let me buy a pack of smokes for what it should cost - a few cents per pack, not a few dollars/euros.
by diacritical
3/17/2026 at 10:48:37 PM
If I can smell it, I don't really care if you're blowing it directly at me or not; it's still a pain. If you want to smoke in private in your own home and then wash your clothes after so no one can tell you're doing it, I guess that's fine, but I don't see why it also has to be cheap?

by nemomarx
3/17/2026 at 11:03:04 PM
I admit I sometimes smoke near people, even if I try to move to the side. At bus stops I try to be 5-10 meters away from people, but often I don't do it and it inconveniences people. Sorry, truly. I will try to be more mindful. When I switched to e-cigs for a while a couple of years ago, I started noticing the smell of tobacco smoke. After I switched back to cigs, I stopped noticing it. Smokers don't notice it that much as they're around it often. It's not always smokers being inconsiderate; it's not realizing how it smells to others. If you let me smell the clothes of a smoker and a non-smoker, I wouldn't be able to tell the difference if my life depended on it. Although I only smoke outdoors and wash my clothes regularly, so I hope my base smell isn't that offensive to non-smokers.

So yeah, this comment really reminded me not to light up whenever and "try my best" to walk a few meters away, but to really think about whether I'd inconvenience people.
On the other hand, if I'm alone on a street and you're walking towards me so I just pass you for a second, I can't imagine that the smell would be that bad from just a casual walk-by. When I'm passing people, I hold in my smoke till I pass them.
Even if I agree that smoking outdoors is inconsiderate and annoying to others, I could still do it at home or in dedicated areas (smoking sections in bars with good ventilation, for example).
> I don't see why it also has to be cheap?
If we agree on the previous points, then why not let it be cheap? Tobacco is cheap to produce. Most of the price of cigarettes is artificial, to cover medical costs and whatnot. Let's say I sign a waiver that if I get sick, I either pay through the nose or don't receive treatment at all. Would you be OK with letting me buy tobacco at its original cost (no subsidies, no artificial fees)?
Or, as a thought experiment - let's say tobacco didn't have any smell and there were 0 negative effects of second-hand smoke. Like, you wouldn't know it if I smoked near you unless you saw me. Then what would be the justification in making smoking artificially expensive for me?
by diacritical
3/17/2026 at 11:56:24 PM
If it wasn't for the impact on other people, I think you could handle it basically like sugary drinks - there's some benefit in discouraging it for health reasons, but not as much benefit comparatively, so a more modest tax is all I could really argue for, yeah. (Like how nicotine gum is treated, essentially.)

by nemomarx
3/18/2026 at 12:19:48 AM
Since the impact is mostly annoyance (the smell), and most restaurants are either smoke-free or offer separate enclosures, why tax it at all (besides for the smell)? I am reducing my lifespan by about 8 to 10 years with smoking, sure. But why should the government force me to change that by taxing it? Why tax sugary drinks, or ban or criminalize drugs other than caffeine, nicotine and alcohol?

If the idea is to make everyone be healthy, live as long as possible and be productive for as long as possible, why not ban dangerous sports, too? I'm "the government" for my dog and I don't let him do anything dangerous or stupid, but he's a dog and we're people. With the supposed free will and agency we all like.
by diacritical
3/18/2026 at 1:46:14 AM
> But why should the government force me to change that by taxing it?

Because the government ends up paying for the medical treatment of a lot of smokers when they're older. And it's incredibly expensive. You can say you won't rely on government funds, but there's no way to actually opt out of Medicare for life or sign up to never be guaranteed stabilization when you show up at a hospital.
Nicotine is also notoriously addictive, which weakens the "my choice" argument.
> Why tax sugary drinks
That's totally a nanny state thing. Personally, I would mildly support it. But it's not a hill I'd die on.
> or ban or criminalize drugs other than caffeine, nicotine and alcohol?
Hard drugs cause blight. People don't mind so much if they see a soda can on their street, but if they see a used needle they'll move. And again, any society with a safety net has an interest in preventing common causes of people falling into it.
> why not ban dangerous sports, too?
It hasn't proven to be a big problem at the population level. Hell, public health experts would love to have that problem, because it'd mean more people were exercising.
by vharuck
3/18/2026 at 2:10:19 AM
> Because the government ends up paying for the medical treatment of a lot of smokers when they're older. And it's incredibly expensive. You can say you won't rely on government funds, but there's no way to actually opt out of Medicare for life or sign up to never be guaranteed stabilization when you show up at a hospital.

That's why I'd get a tattoo on my chest, if necessary, saying "Smoker!". I know that most of the price of tobacco is insurance for medical treatments. Not Medicare, as I'm not in the US, but similar. I am OK with tattooing "DO NOT STABILIZE OR CARE FOR AT ALL - SMOKER !!!1".
> Nicotine is also notoriously addictive, which weakens the "my choice" argument.
I am an adult human who participates in society and has chosen to smoke. Please treat me as an adult who has made a (bad) decision and is willing to suffer the consequences.
> sugary drinks... nanny state
Same with any drug.
> hard drugs...
People who abuse hard drugs to the point where we need to save them or others from them are most often uneducated or poor (and living in a poor neighborhoods, with all that it brings). Believe it or not, I know several people with PhDs in things like physics and biology who regularly take "hard" and/or "soft" drugs besides alcohol and nicotine. Only one needed intervention after ~10 years and it was because of pre-existing psychological issues that led him to abuse the drugs. I and lots of people I know who lead normal lives can list more 3- or 4-letter abbreviations of stuff we've tried than a HN comment will let us fill. Or maybe I'm exaggerating a bit, not sure, but you get the point.
If you look at a poor neighborhood, you'll see a lot more people with drug problems. Not because richer people don't do drugs, but because it's not an escape plan, it's not some random impure thing you get, and because it's done within a safe place. It's a social issue, not a drug issue. Work on solving poverty and education, not on making us drug users feel like criminals for trying new stuff or on making our drugs more expensive. Whether it's legal like alcohol or nicotine, or an illegal one - a psychedelic, a benzo, weed, an opioid, a dissociative or anything else - it's a drug. I am an adult. Let me experience my adulthood like I want to. You don't take drugs and that's fine, but please understand that you have no fucking idea what you're missing if you're doing it correctly. Literally anything you've likely experienced, like romantic relationships, climbing mountains, orgasms and so on, is categorically and qualitatively different from the amazing things you can experience on various drugs.
by diacritical
3/17/2026 at 11:02:24 PM
> You cut off social media, you at worst face some boredom and FOMO.

I wish this was true, but I know tens of people that quit smoking and (besides myself) know 1/2 of another person that quit social media. Drunk at NYE two years ago, I offered $10k to a group of 25 people to delete all social media apps from their phones for 60 days - I still have that $10k in my account. I think quitting social media is around the same as getting off a hard drug addiction (like a hard, hard, hard one - opioids, heroin etc...) and maybe even tougher than that - for most people.
> People didn't quit smoking because they suddenly got more disciplined. We just made it inconvenient.
I want to believe this! I just haven't personally experienced it at all (I am in my 6th decade on Earth, so plenty of time around). I don't know a single person that stopped smoking because they could not burn one inside restaurants/clubs/... or because it costs $18/pack or any of that. An 18-year-old person has very little "regulation" when it comes to smoking. A little inconvenience of moving 25 feet away from the building isn't much of a deterrent IMO.
I am subjective on the matter of social media, I know that. But I am educated in its evils and would, for instance, never let my kid be on any social media as long as she is under my roof. This has already caused significant challenges for her (and my wife and me), but it is also an amazing learning experience in overcoming silly social obstacles...
by bdangubic
3/17/2026 at 10:15:49 PM
I think it's also partially due to smoking being more and more considered disgusting, not just inconvenient. The peer pressure of "don't do this very stinky disgusting thing around me" must have at least a little to do with declining smoking rates. Back in the 80s, most people didn't have the guts to say "Hey, don't smoke around me, it's gross!" but plenty of people do today.

We need to culturally consider Social Media use to be disgusting or at least something to be ashamed of.
by ryandrake
3/18/2026 at 3:42:07 AM
The irony is that social media trends are making smoking cool again.

by slopinthebag
3/17/2026 at 9:02:06 PM
Not a fan of conflating personal enjoyment of a vice with promoting hatred.

by brookst
3/17/2026 at 9:04:46 PM
It's like asking how you get people to stop letting their kids drink alcohol.

Everyone knows what the dangers of alcohol are now. We need to get reliable data one can base policy on and then let the public health system do their thing. Maybe not every health authority, but enough of them to protect the species at large. Then we'll get social media out of schools, away from young people, vulnerable folks, etc.
by barbazoo
3/17/2026 at 11:10:10 PM
Why would someone want to get other people to stop drinking alcohol?

by slopinthebag
3/18/2026 at 4:31:53 AM
> "What do we do about it?"I'd suggest something like banning algorithmic amplification - your feed is posts of people you follow and nothing else. But that's not what will happen. What will happen is there will be [1] vague laws about preventing vague "harm", written to give legal teeth to the Overton window. Not in those words, but companies that would go against it will be mired in lawfare, while those that comply will be allowed to grow.
And if you complain, they'll motte-and-bailey you - you're not in favor of "harm", are you? We're not an authoritarian speech police, we only seek to protect people from "harm".
[1] Or rather, are - see https://en.wikipedia.org/wiki/Online_Safety_Act_2023
by like_any_other
3/18/2026 at 12:07:51 PM
My IG feed is largely taken over by congressional members' videos, crazy $#!t the president (and his crew) says, and the Keystone Cops. And boy howdy, is there a lot of rage-inducing behavior going on.

I feel more informed than if I was only listening to NPR.
That said, I stay away from anything that’s produced—sound track, too many cuts/edits, talking head commentary. I guess in this context, if I’m going to be driven to emotional anxiety, it’s going to be from something that happened or something someone said, and not the internet’s interpretation.
You can’t “produce content” that I will watch _as news_. It has to be in some real way happening (with some deference to Rashomon).
by xtiansimon
3/17/2026 at 9:54:12 PM
The people who were voted into power (across the globe, not just the US) to do something about it are stuck getting their dopamine kicks posting garbage on the same platforms. It's truly a terrible timeline we are in.

by techpression
3/18/2026 at 3:20:42 AM
We are in a real-life cyberpunk dystopia. Without any of the fun parts.

by yoyohello13
3/17/2026 at 9:01:54 PM
Regulate it. Laws, consequences, etc.

by toomuchtodo
3/17/2026 at 9:13:01 PM
Laws appear to have fallen out of fashion. And a disturbing proportion of the loudest people like it. Then you have those who ought to know better but are attention-seeking, selfish assholes who somehow find it «interesting» or think they adhere to «principles».

The latter category - you know who you are. You downvoted this comment.
by bborud
3/17/2026 at 9:18:13 PM
I recently provided guidance to state legislators, with that guidance making its way into law in regard to balcony solar. If you don't think that making law works, I would encourage you to get involved somewhere that means something to you.

It turns out that if you present as an honest, disinterested party, people will call you and ask you for your advice. I do admit that the ease of this is going to be a function of the people you are up against and the subject being regulated. My point of this comment is: default to action. "You can just do things."
by toomuchtodo
3/20/2026 at 1:01:03 PM
The issue is not laws, or the making of them (although Congress hasn't exactly been overly productive). The issue is the executive branch not abiding by laws.

by bborud
3/17/2026 at 10:24:46 PM
> Laws appear to have fallen out of fashion.

Laws are very much fashionable, but only for us. “Rules for thee but not for me” is what's in season right now.
by autoexec
3/17/2026 at 10:25:54 PM
Importantly, seasons change.

by toomuchtodo
3/18/2026 at 3:18:11 AM
I'm going to bet we do nothing and continue to complain instead.

by yoyohello13
3/18/2026 at 2:02:00 PM
Oh, 100%, but I can dream...

by bigfishrunning
3/18/2026 at 2:10:20 AM
What do we do? We treat platforms with algorithmic news feeds as publishers, not platforms, in the Section 230 sense.

Think about it this way: imagine if you took a million random posts or videos. You would find a wide range of political views, conspiracy theories and so on. Whatever your position on any of those issues, you could find content pushing those views.
So if your algorithm selects and distributes content that fits your desired views and suppresses content that opposes your views, how are you different from a random publisher who posts content with those exact same views?
This is kind of like the "secret third thing" of Section 230 where you get all the protections of being a platform and all the flexibility of being a publisher and we need to close that loophole. Let platforms choose which one they are.
Another example: if I create a blog and write a post that accuses my local mayor of being a drug addict and a pedophile, I can be sued for defamation. You can try the journalism defense, but it won't shield you from defamation. Traditional media outlets are normally very careful about what they publish for this reason.
But what if I run Facebook or Twitter and one of my users says the exact same thing? Well I'm just a platform. I have a libel shield. But again, my algorithm can promote or suppress that claim. Even if I have processes to moderate that content, either by responding to a court order to take it down and/or allowing users to flag it and then take it down myself with human or AI moderation, the damage can't really be rolled back.
We've let tech companies get away with "the algorithm" being some kind of mysterious and neutral black box that just does stuff and we have no idea what. It's complete bullshit. Every behavior of such an algorithm reflects a choice made by people, period. And we need to start treating this as publishing.
by jmyeet
3/18/2026 at 12:57:40 AM
[dead]by wotsdat
3/17/2026 at 11:39:36 PM
[flagged]by noAnswer
3/17/2026 at 9:02:15 PM
[flagged]by b65e8bee43c2ed0
3/17/2026 at 9:18:22 PM
Nothing is inherently illegal. Laws are created in response to an undesirable outcome - murder wasn't illegal until it was made illegal.by munk-a
3/17/2026 at 9:19:03 PM
[flagged]by sigmar
3/17/2026 at 11:13:06 PM
Consuming social media doesn't have an inescapable negative impact on other people, unlike burning leaded fuel. In the same way that eating junk food doesn't. Should we ban junk food? What else do you want to ban from others just because it has a risk profile you personally don't feel comfortable with?by slopinthebag
3/18/2026 at 2:55:12 AM
> Consuming social media doesn't have an inescapable negative impact on other people
You don't think large portions of an entire generation (or generations) getting cooked by social media has negative externalities that impact society as a whole?
by Rapzid
3/18/2026 at 3:23:32 AM
I don't think anybody has the moral authority to regulate such second-order effects.
Should unhealthy food be banned because of the second-order effects of obesity? What about mandatory church / religious service? After all, I judge that atheism has negative second-order effects on the world. Where would I get this moral authority from?
by slopinthebag
3/18/2026 at 5:07:57 AM
For fuzzy second-order effects you have fuzzy second-order-impacting laws.
You increase disclosure norms, you increase monitoring and you ensure marketing and packaging norms that disclose the potential risks.
You aren’t allowed to put up booze and cigarette stores near schools. These are not new problems that humanity has never encountered before.
by intended
3/18/2026 at 5:49:36 AM
> You aren’t allowed to put up booze and cigarette stores near schools.
Huh? Where? In many countries grocery and convenience stores sell both. When I was in school I could have gone across the street to get both. Everywhere I've travelled it's been even more accessible. The only place I've seen these restrictions is in very religious places, which are not analogous to morality in any way.
Let's play a little thought experiment: Is it okay for me and my friend to send each other messages over the internet? Can we send images and videos? What about a group chat with all of our friends? What if our neighbourhood joins in? What if our city joins in? What if our country joins in?
Can you identify the precise step in which this becomes unallowable? Can you articulate a logical reason why it's unallowable, but the previous steps are fine?
Can you do this without it becoming a subjective question about your personal moral values?
This is the problem with laws and mandates. They can't just be based on your own subjective feelings. And as humans, we have very different thoughts and feelings on what is good and bad, what should be allowed an unallowed. Furthermore, many things are perfectly legal despite causing harm. If I reject someone's advances and they suffer negative mental consequences, have I violated their rights? They've suffered harm after all. To whom are their obligations for?
There can be claimed "fuzzy second order effects" to every single human action. Authoritarians believe they are smarter than everyone else and have the right to enforce their subjective and often incorrect opinions on everyone else. In another country, on another topic, this would be about something else - maybe religion. This does not form a solid legal basis for anything.
by slopinthebag
3/17/2026 at 9:32:02 PM
[flagged]by b65e8bee43c2ed0
3/17/2026 at 10:50:05 PM
I wonder where folks like this came from, and at what point people who associate themselves with hacker culture decided that censorship is great.
The OG hackers thought of censorship as network damage that needed to be routed around.
People who support censorship always think of themselves as smarter than the rest. Dunning-Kruger however would suggest something different.
by temp8830
3/17/2026 at 11:37:07 PM
I posted above that social media related issues are a problem, and then a bunch of posts accused me of wanting to make it illegal. I never suggested that and I actually don't support censorship, I just wish some people I know didn't spend so much of their time bummed out about social media.by bigfishrunning
3/17/2026 at 9:06:28 PM
> > "What do we do about it?"
> nothing. if it isn't illegal, it isn't illegal.
Are you suggesting that because something isn't illegal, it shouldn't be illegal?
Are you perhaps a representative of the Triangle Shirtwaist Factory?
by DonaldPShimoda
3/18/2026 at 8:23:22 AM
Please don't post shallow dismissals or flamebait on HN. The guidelines make it clear we're trying for something better here.by tomhow
3/17/2026 at 9:10:39 PM
I'm not suggesting that it should be illegal, I'm just seeing this monetization of bad vibes and wondering how we can have less bad vibes. Pump the brakes a little.by bigfishrunning
3/17/2026 at 9:05:04 PM
Things that are not illegal can and should be made illegal if need be. Many things were not illegal before they became illegal.
by surgical_fire
3/17/2026 at 9:35:01 PM
okay. go ahead and make "conspiracy theories" illegal.by b65e8bee43c2ed0