2/17/2026 at 4:41:42 PM
This is a good statement of what I suspect many of us have found when rejecting the rewriting advice of AIs. The "pointiness" of prose gets worn away, until it doesn't say much. Everything is softened. The distinctiveness of the human voice is converted into blandness. The AI even says its preferred rephrasing is "polished", a term which specifically means the jaggedness has been removed.
But it's the jagged edges, the unorthodox and surprising prickly bits, that tear open a hole in the inattention of your reader, that actually get your ideas into their heads.
by barrkel
2/17/2026 at 5:35:19 PM
I think that mostly depends on how good a writer you are. A lot of people aren't, and the AI legitimately writes better. As in, the prose is easier to understand, free of obvious errors or ambiguities.
But then, the writing is also never great. I've tried a couple of times to get it to write in the style of a famous author, sometimes pasting in some example text to model the output on, but it never sounds right.
by svara
2/18/2026 at 1:59:37 AM
> I think that mostly depends on how good a writer you are. A lot of people aren't, and the AI legitimately writes better.
Even poor writers write with character. My dad misspells every 4th word when he texts me, but it’s unmistakably his voice. Endearingly so.
I would push back, with passion, on the claim that AI writes “legitimately” better: it has no character except the smoothed mean of all internet voices. The millennial gray of prose.
by datsci_est_2015
2/19/2026 at 12:39:54 PM
Oh god no, trust me, I'm an academic. I'd rather read an AI essay than the stuff some of my students write.
by surfacedetail
2/18/2026 at 10:50:46 AM
AI averages everything out, so there's no character left.
A similar thing happens when something is designed by a committee: good for average use, but not really great for anything specific.
by zigzag312
2/18/2026 at 4:44:07 PM
Haskell was a success story of design by committee (please correct me if I'm wrong).
by bananaflag
2/17/2026 at 9:20:24 PM
It depends how you define "good writing", which is too often associated with "proper language", and by extension with proper breeding. It is a class marker.
People have a distinct voice when they write, including (perhaps even especially) those without formal training in writing. That this voice is grating to the eyes of a well educated reader is a feature that says as much about the reader as it does about the writer.
Funnily enough, professional writers have long recognised this, as is shown by the never-ending list of authors who tried to capture certain linguistic styles in their work, particularly in American literature.
There are situations where you may want this class marker to be erased, because being associated with a certain social class can have a negative impact on your social prospects. But it remains that something is being lost in the process, and that something is the personality and identity of the writer.
by aaplok
2/17/2026 at 6:50:19 PM
> A lot of people aren't, and the AI legitimately writes better.
It may write “objectively better”, but the very distinct feel of all AI generated prose makes it immediately recognizable as artificial and unbearable as a result.
by littlestymaar
2/17/2026 at 5:43:28 PM
I find most people can write way better than AI; they simply don’t put in the effort.
Which is the real issue: we’re flooding channels not designed for such low-effort submissions. AI slop is just SPAM in a different context.
by Retric
2/17/2026 at 7:42:24 PM
You may be in a bubble of smart, educated people. Either way, one of the key ways to "put in the effort" is practice. People who haven't practiced often don't write well even if they're trying hard in the moment. Not even in terms of beautiful writing, just pure comprehensibility.
by andrewflnr
2/17/2026 at 8:14:11 PM
I may be in a bubble of smart people, but IMO AI is consistently far worse than much of the high school work I’ve read in terms of actual substance and coherent structure.
Of course, I’ve had arguments where people praise AI output, then I’ve literally pointed out dozens of mistakes and they just kind of shrug, saying it’s not important. So I acknowledge people judge writing very differently than I do. It just feels weird when I’d give something a 15% and someone else would happily slap on a B+.
by Retric
2/17/2026 at 9:06:53 PM
My experience has been (ordered from best to worst):
1. Author using AI well
2. Author not using AI
3. Author using AI poorly
The gap between 1 and 2 is driven by the underlying quality of the writer and how well they use AI: a really good writer sees marginal improvements, while a really poor one can see vast improvements.
by JamesBarney
2/17/2026 at 5:53:05 PM
I am really conflicted about this because yes, I think that an LLM can be an OK writing aid in utilitarian settings. It's probably not going to teach you to write better, but if the goal is just to communicate an idea, an LLM can usually help the average person express it more clearly.
But the critical point is that you need to stay in control. And a lot of people just delegate the entire process to an LLM: "here's a thought I had, write a blog post about it", "write a design doc for a system that does X", "write a book about how AI changed my life". Then they ship it, outsourcing the process of making sense of the output and catching errors to others.
It also results in the creation of content that, frankly, shouldn't exist because it has no reason to exist. The amount of online content that doesn't say anything at all has absolutely exploded in the past 2-3 years. Including a lot of LLM-generated think pieces about LLMs that grace the hallways of HN.
by lich_king
2/17/2026 at 6:17:19 PM
Even if they “stay in control and own the result”, it’s just tedious if all communication is in that same undifferentiated sanded-down language.
by layer8
2/17/2026 at 5:59:07 PM
I think it’s essential to realize that AI is a tool for mainstream tasks like composing a standard email and not for the edges.
The edges are where interesting stuff happens. The boring part can be made more efficient. I don’t need to type boring emails, and people who can’t articulate well will be elevated.
It’s the efficient popularization of the boring stuff. Not much else.
by baxtr
2/18/2026 at 9:21:58 AM
> The edges are where interesting stuff happens. The boring part can be made more efficient. I don’t need to type boring emails, and people who can’t articulate well will be elevated.
I think that boring emails should not be written. What kind of boring emails NEED to be written, but you don't WANT to write? Those are exactly the kind of email that SHOULD NOT be passed through an LLM.
If you need to say yes/no, you don't want to take the whole email conversation and let an LLM generate a story about why you said yes/no.
If you want to apply for leave, just keep it minimal: "Hi <X>, I want to take leave from Y to Z. Thanks". You don't want to create two pages of justification for why you want to take this leave to see your family and friends.
In fact, for every LLM output, I want to see the input instead. What did they have in mind? If I have the input, I can ask an LLM to generate 1 million outputs if I really want to read an elaboration. The input is what matters.
If I have the input, I can always generate an output. If I have the output, I don't know what the input (i.e. the original intention) was.
by anon-3988
2/17/2026 at 6:04:13 PM
It contributes to making “standard” emails boring. I rather enjoy reading emails in each sender’s original voice. People who can’t articulate well aren’t elevated; instead, they are perceived to be sending bland slop if they use LLMs to conceal that they can’t express themselves well.
by layer8
2/17/2026 at 5:55:09 PM
I think it is also fairly similar to the kind of discourse a manager in pretty much any domain will produce. He lacks (or has lost through disuse) technical expertise on the subject, so he uses more and more fuzzy words, leaky analogies, buzzwords.
This may be why AI-generated content has so much success among leaders and politicians.
by folbec
2/17/2026 at 8:41:03 PM
Every group wants to label some outgroup as naively benefiting from AI. For programmers, apparently it's the pointy-haired bosses. For normies, it's the programmers.
Be careful of this kind of thinking; it's very satisfying, but it doesn't help you understand the world.
by coke12
2/17/2026 at 5:01:37 PM
> But it's the jagged edges, the unorthodox and surprising prickly bits, that tear open a hole in the inattention of your reader, that actually get your ideas into their heads.
This brings to mind what I think is a great description of the process LLMs exert on prose: sanding.
It's an algorithmic trend toward the median; they sand down your words until they're a smooth average of their approximate neighbors.
by devmor
2/17/2026 at 4:51:10 PM
Mediocrity as a Service
by gdulli
2/17/2026 at 5:14:39 PM
I liked mediocrity as a service better when it was fast food restaurants and music videos.
by co_king_5
2/17/2026 at 11:40:40 PM
artificial mediocrity
by DuperPower
2/17/2026 at 11:38:04 PM
No, but it's bad writing. It repeats information, it adds superfluous stuff, it doesn't produce more specific ways of saying things. You're making it sound like it's "too perfect" when it's bland, because it's artificial dumbness, not artificial intelligence.
by DuperPower
2/17/2026 at 8:51:52 PM
Bryan Cantrill referred to it as "normcore" on a podcast, and that's the perfect description.
by piker
2/18/2026 at 3:54:14 AM
Well said. In music, it's very similar. The jarring, often out-of-key tones are the ones that are the most memorable, the signatures that give a musical piece its uniqueness and sometimes even its emotional points. I don't think it's possible for AI to ever figure this out, because there's something about being human that is necessary to experiencing or even describing it. You cannot "algorithmize" the unspoken.
by johnnienaked
2/17/2026 at 4:46:24 PM
I'm sure this can be corrected by AI companies.
by amelius
2/17/2026 at 4:52:00 PM
The question is… why? What is the actual human benefit (not monetary)?
2/18/2026 at 7:12:49 AM
IME, in prose writing, arguing with an LLM can help a newer writer gather 'the facts' (to help with research) and 'the objections to the facts' (same result) to anticipate an initial approach to the material. This can save a lot of organizational time. After which, the newer writer can more confidently approach topics in their own voice.
by 8bitsrule
2/17/2026 at 9:08:01 PM
If AI wrote and thought better by default then I wouldn't have to read the AI slop my co-workers send me.
by JamesBarney
2/17/2026 at 4:48:36 PM
Just let my work have a soul, please.
by q3k
2/17/2026 at 5:31:48 PM
That is NOT possible.
by AreShoesFeet000
2/17/2026 at 5:39:36 PM
Why not?
by q3k
2/17/2026 at 5:48:47 PM
Because even though at work it looks like you’re tasked with creating use values, you’re only there as long as the use values you create can be exchanged in the market for a profit. So every humane drive to genuinely improve your work will clash with the external conditions of your existence within that setting. You’re not there to serve people, create beautiful things, solve problems, nu-uh. You’re there to keep capital flowing. It’s soulless.
by AreShoesFeet000
2/17/2026 at 6:12:33 PM
Unless you work in the public sector, a non-profit, or a charity.
by Angostura
2/17/2026 at 6:50:49 PM
To think that “non-profit” work is actually non-profit work is just to not have grasped the nature of labor. You have to ask yourself: am I producing use values for the satisfaction of human needs, or am I working on making sure the appropriation of value extraction from the production of use values continues happening?
In some very extreme cases, such as in the Red Cross or reformist organizations, your job looks very clear, direct, and “soulful”. You’re directly helping desperate people. But why have people gotten into that situation? What is the downstream effect of having you help them? It’s profit. It’s always profit. You’re salvaging humanity for parts to be bought and sold again. That doesn’t make it dishonest work. It’s just equally soulless.
by AreShoesFeet000
2/18/2026 at 1:24:01 PM
Your argument appears to be that if you redefine all of humanity to be mere grist for a capitalist machine, you can then redefine any altruistic act as a measure to extract more profit.
Truly a feat of semantic legerdemain.
by Angostura
2/18/2026 at 4:05:25 PM
I don’t define anything. The truth is just that there’s no profit extraction without charity work. I’ve done lots of it. If you’ve done it, you know too.
As dark as it may seem to strip romanticism out of what you call humanity, not only is there not a just salary for those who bear the weight of the machine, there isn’t even a salary per se.
If for you humanity is just doing seemingly nice guy work without question, call me a monster.
by AreShoesFeet000
2/18/2026 at 4:52:31 PM
> The truth is just that there's no profit extraction without charity work.
I'm not actually sure what you mean by this, so I can't really assess its truthiness.
> not only is there not a just salary for those who bear the weight of the machine, there isn't even a salary per se.
Or this - what do you mean?
> If for you humanity is just doing seemingly nice guy work without question, call me a monster.
Not even clear what you mean by this either.
by Angostura
2/18/2026 at 5:09:14 PM
My adversary has accused me of sophistry. As if I’m just a crafter of kaleidoscopes. I’m just giving back the compliment by calling out their romanticism.
Charity work can bring momentary fulfillment to a person. I’m not reducing humanity by situating it within the machine. You even have the right to reject the material proposition that charity work is a piece that composes the totality of the machine. But eventually all truth will be self-evident, so let’s leave it to the reader.
by AreShoesFeet000
2/18/2026 at 8:03:42 PM
I’m not your adversary; I’m just trying to understand your point.
Your original assertion was that ‘you’re only there as long as the use values you create can be exchanged in the market for a profit.’
When I suggested that non-profit or public sector jobs could certainly have soul, your responses were pretty incomprehensible.
Can you explain your point clearly and succinctly?
by Angostura
2/19/2026 at 2:34:37 AM
Because you’re aiding exploitation either way. It’s the same machine, just another part of it.
by AreShoesFeet000
2/19/2026 at 12:26:44 PM
Right. So it's not just work: any good or altruistic act will, by definition, only act to stoke the machine.
It's certainly a way of thinking, I suppose.
by Angostura
2/19/2026 at 1:20:18 PM
Incorrect. It’s mostly just work.
by AreShoesFeet000
2/19/2026 at 2:21:17 PM
So if I carry out hip replacement surgery at my own cost, it's good?
But if the NHS pays me to carry out hip replacement surgery, funded from tax revenue but free to the patient, it's bad?
by Angostura
2/19/2026 at 3:52:11 PM
This is not a moral judgement. It doesn’t even matter which pocket the money is coming from.
by AreShoesFeet000
2/17/2026 at 4:59:49 PM
Eh, it's not __that__ simple.
by amelius
2/17/2026 at 5:21:36 PM
It is; just don’t use a thing with no soul like AI if soul is what you’re after.
by ses1984
2/17/2026 at 5:34:53 PM
The point is that he may not be using AI in any shape or form. Regardless, AI scrapes his work without explicit consent and then spits it back in "polished", soul-free form.
by vasvir
2/17/2026 at 5:26:39 PM
Great comment. It really is that simple.
by co_king_5