4/10/2026 at 12:24:11 PM
Maybe the content is great, but the AI writing style is really grating with its staccato sentences and faux-"profoundness". Can't bear it any more, stopped reading. "You’re not checking logic. You’re checking shape." Ugh.
by jaen
4/10/2026 at 4:03:57 PM
Sorry for that, everyone. I did use the AI to help me with structure and English. I thought I'd proofread and edited that enough to be readable, but apparently it still smells. I'll update the wording soon.
by Firfi
4/10/2026 at 4:30:20 PM
Or you can just write in your native language, and let us machine-translate it? Just a thought. We are, perhaps, letting ourselves be held back by norms that no longer bear any load.
by nz
4/10/2026 at 4:32:35 PM
That's a great idea, in fact. I'll try it out next time. Maybe even a mix, because I do sometimes want to be very specific about some expressions and experiment with wordplay.
by Firfi
4/10/2026 at 1:04:19 PM
This and subheadings like “the problem” and “The feature space” bother me for reasons I can’t fully explain. It feels like the laziest possible section separator and generally would be better with an extra space divider or something.
It’s so prevalent in AI writing.
by data-ottawa
4/10/2026 at 3:44:48 PM
The mental model I am using for online writing is that it is analogous to the spectrum of `pretending <-> acting`. The worst writing (AI or otherwise) looks, sounds, and feels like pretense, like a kid who tucks a towel into his shirt and runs around pretending to be a super-hero. Meanwhile, acting, true acting, is invisible; it is a synonym for _being_[1].
That said, a lot of AI writing feels "procedural", in the sense that most corporate writing (whitepapers, press releases, etc.) feels procedural (i.e. the result of a constructed procedure). Before AI, the constructed procedure was basically that a piece of writing passed through a bunch of people (e.g. engineering -> management -> marketing -> website/email), and the output was bland, forgettable pablum designed to (1) be SEO-friendly, (2) be spam-filter friendly, (3) be easy to ingest, (4) look superficially trustworthy and authoritative (e.g. inflated page count, extra jargon, numbers, plots), and (5) look like it belongs to the "scene" or "industry" by imitating all the other corporate writing out there[2].
AI is interesting, in the same way that computers or the internet or an encyclopedia are interesting: how people choose to use it tells you a lot about them. All of those technologies can be used to compensate for a lack of skill (it helps one pretend), or they can be used to forge a skill (it helps one become).
One has to pretend before they can act (I guess? Feels intuitively correct to me). So perhaps AI (and the web, and the computer, and the encyclopedia) is only harmful to the extent that it does not nudge a person towards becoming[3]? And if so, that's a _cultural_ limitation, not a technological one.
[1]: I am not an actor, and so I might be wrong, but that is the impression I get from just watching and analyzing the acting in various films.
[2]: this becomes frustrating when you get criticized for producing something that "reads like $famousSomething", and then you get criticized again for producing something that "does not read like $typeOfFamousSomething".
[3]: No clue how you (plural -- let's bring back "yous") will convince your boss that you did not take the shortcut, because you were trying to "become more".
by nz
4/10/2026 at 2:11:52 PM
I'm worse than you: the quotes are what drive me insane:
> “HP never exceeds max”
I think it's because it's such a braindead thing to fix that when I see them, it's clear the "author" hasn't even read their own "work".
Like, you're not even trying to hide it at the laziest level possible. Blegh.
(See how you can tell a human wrote that?)
by ghurtado
4/10/2026 at 2:04:43 PM
The way things are headed, people with the ability to write on their own are going to be the hottest hires of the 2030s.
by ghurtado
4/10/2026 at 2:14:15 PM
You think there will still be writing jobs for human beings at all by then? AI will be so normalized across culture that any raw, unfiltered human expression will read as gross and unprofessional to most people.
by krapp
4/10/2026 at 3:19:54 PM
Maybe for resume cover letters and LinkedIn posts, but I haven't met anyone with half-decent taste who prefers AI writing, even well prompted, to skillful human writing. I'm not a stranger to using AI for writing tasks by any means, but it's only ever a starting point that gets heavily rewritten by both myself and the model.
by pigpop
4/10/2026 at 3:35:10 PM
It's not even just for writers, either. If I were currently hiring, not using AI would be the cheapest, fastest way to impress me.
I'm not kidding when I say that typos are not too far from becoming a sign of higher intelligence. Or at least better taste than most.
I'm surprised tunable intentional "human" mistakes are not a core feature of LLMs. Maybe it's actually hard for them?
by ghurtado
4/10/2026 at 10:26:53 PM
It's not hard to get them to copy a style; you just have to provide examples, and they will happily produce similar text, including grammatical and spelling mistakes. The trouble is with composition and novelty. Most of the big models have had all of the interesting parts hidden behind a wall of RLHF. Local models are better, since you can use ones that are not indoctrinated as a "helpful assistant" and also control the system prompt and temperature and see the top-K alternate tokens, which lets you steer them in interesting ways.
by pigpop
4/10/2026 at 8:14:56 PM
> Maybe for resume cover letters and LinkedIn posts but I haven't met anyone with half decent taste who prefers AI writing, even well prompted, to skillful human writing.
That attitude is one, maybe two generations away from extinction. Taste is created by the market, which caters to the young. When enough people have been born into a world in which AI-generated culture and communication is the norm, that is what will define what good taste is. People like you (and I) will just come off like old people yelling at clouds.
We can already see this happening at the fringes. People have relationships with AI, they prefer AIs to real people, they use AI as a primary source of truth, they consider AI generated art to be superior to human work, they trust AI more than people. People identify as AI. AI is filling an emotional, sociological and creative space that an increasingly alienating and hostile society denies to people, for better or worse. Generative AI has only been a thing in popular culture for four years or so and it has already completely transformed human society and human sociology.
Barring a complete collapse of the AI bubble, which seems essentially impossible at this point given how invested our economies and governments are in it, that's just what normal is going to be in a decade or so.
by krapp
4/10/2026 at 10:41:04 PM
There's taste and then there's taste. Popular taste is guaranteed to be awful, since it is driven by economics and fads. That's the type you point out as created by the market and catering to the young. It's a disposable product of consumption used to sell shoes and overpriced paintings.
I don't disagree that it will permeate everything; it already does. It'll just be written by an AI instead of people being paid to find the next style to cop. I don't think it will extinguish human writing. You'll just have AI writing that you feed to official or public channels, and then real writing that goes in private or pseudonymous channels. Using AI writing among friends or an in-group will still be a faux pas and cringe, because it will have become the norm to be rebelled against.
by pigpop
4/10/2026 at 2:41:31 PM
Tangent, but... it must’ve picked up the faux profoundness on LinkedIn. Those posts I find truly unreadable. It half-seriously makes me think anyone being able to post anything was a bad move.
by beng-nl