5/23/2025 at 12:56:48 AM
> if you want to sculpt the kind of software that gets embedded in pacemakers and missile guidance systems and M1 tanks—you better throw that bot out the airlock and learn.
But the bulk of us aren't doing that... We're making CRUD apps for endless incoming streams of near identical user needs, just with slightly different integrations, schemas, and lipstick.
Let's be honest. For most software there is nothing new under the sun. It's been seen before thousands of times, so why not recall and use those old nuggets? For me, coding agents are just code-reuse on steroids.
P.S. Ironically, the article feels AI-generated.
by padolsey
5/23/2025 at 1:06:26 AM
I don't mind your rebuttal of the article, but to suggest that this particular article is AI-generated is foolish. The style the author presents is vivid, uses powerful imagery and metaphor and finally, at times, is genuinely funny. More qualitatively, the author incorporates a unique identity that persists throughout the entirety of a long-form essay.
All of that is still difficult to get an LLM to do. This isn't AI-generated. It's just good writing. Whether you buy the premise or not.
by throwaway314155
5/23/2025 at 4:26:30 AM
Yeah, it feels like a very different style of unhinged from LLMs. I can’t yet imagine an LLM producing such a beautiful and contextually appropriate sentence as “They’ll be out there trying to duct-tape horses to an engine block, wondering why it doesn’t fly.”
by chrismorgan
5/23/2025 at 5:39:44 AM
This brought tears to my eyes:
> But you—at your most frazzled, sleep-deprived, raccoon-eyed best—you can try. You can squint at the layers of abstraction and see through them. Peel back the nice ergonomic type-safe, pure, lazy, immutable syntactic sugar and imagine the mess of assembly the compiler pukes up.
Amazing
by nick3443
5/23/2025 at 6:10:14 PM
> The style the author presents is vivid, uses powerful imagery and metaphor and finally, at times, is genuinely funny. More qualitatively, the author incorporates a unique identity that persists throughout the entirety of a long-form essay.
It's incredible that you would say that, because you'll never guess what it reads like.
by a0123
5/24/2025 at 5:30:55 PM
I'm glad I'm not the only one to have thought that...
by petemir
5/23/2025 at 3:37:22 AM
I got slightly LLM vibes for the first few paragraphs, ngl. It became very clear, very fast, that it wasn't, though.
by queenkjuul
5/29/2025 at 4:43:23 AM
This rebuttal of the rebuttal does feel eerily AI. Perhaps an injection of cynicism?
by yerushalayim
5/23/2025 at 1:08:45 AM
Maybe he meant it was long. Some people seem to think that long walls of text are how you spot AI slop.
by boxed
5/23/2025 at 2:59:17 AM
And God forbid you use an em-dash these days.
by nozzlegear
5/23/2025 at 11:52:49 AM
I’ve recently been accused of using ChatGPT because I wrote a message with formal language and bullet points.
by xigoi
5/24/2025 at 4:05:55 AM
I'm unapologetic about using em-dashes. They're in my toolbox, and I reach for them. I can and do manually type em-dashes in both Linux and Microsoft Windows using the key-codes.
If anyone bases their opinion of the provenance of a piece of writing on whether em-dashes are used, then that opinion is eminently discardable lemming-babble.
by MrVandemar
5/23/2025 at 2:21:07 AM
[flagged]
by aozgaa
5/23/2025 at 2:39:04 AM
[flagged]
by cedric_h
5/23/2025 at 3:50:06 AM
Fundamentally, mission-critical low-level code isn't the kind of software I want to write anyway. I don't find AI tools super useful for most of the same reasons as the author, but I do kind of get tired of the idea that if you're not writing systems in C you're not really programming.
I like writing front-end code. I'm probably never going to have a job where I need or would even want to write a low-level graphics library from scratch. Fine, I'm not red-eyed 3am hacker-brained, but I'm passionate and good at what I do. I don't think a world where every person working in software has the author's mentality is realistic or desirable.
by queenkjuul
5/23/2025 at 12:38:08 PM
> I like writing front-end code. I'm probably never going to have a job where I need or would even want to write a low-level graphics library from scratch. Fine, I'm not red-eyed 3am hacker-brained, but I'm passionate and good at what I do.
Keep that spirit! All I want from coworkers is genuine interest and curiosity. Not everyone is going to find investigating Linux’s networking stack interesting, just as not everyone is going to find making beautiful pure CSS animations interesting. I think one of the greatest mistakes the tech industry made was to create “full stack,” as though someone would have interest and skill in frontend, backend, and infra. Bring back specialists; we’re all better for it.
by sgarland
5/25/2025 at 6:19:32 PM
I think it’s a fine label. I have skill and interest in all of those areas. Some things require specialists; most jobs out there don’t.
by _rutinerad
5/26/2025 at 1:17:46 PM
Even before LLMs, I sometimes imagined how much human-written code still matters, like the DNA that has survived evolution. It's got to be a super low percentage.
The trend with Autocomplete Industrialization (AI) is just speeding up the creation of shanty towns of code, as opposed to architecturally robust foundations. The survivability of code is dropping because of the Copilotz. But perhaps this rapidity of creating crude solutions will increase the chances of something truly significant?
P.P.S. There are too many places where "it's" should be "its" in that blog post for it to have been AI-generated. That's a different kind of irony IMO, especially since that's just tricky English syntax (the thing AI is supposed to be good at). Maybe the author used AI to come up with the snarky metaphors. I asked ChatGPT for a sarcastic meaning for AI that starts with Autocomplete :)
by Fuhrmanator
5/23/2025 at 2:57:47 PM
From TFA:
> Maybe you’ll never write the code that keeps a plane in the sky. Maybe you’ll never push bits that hold a human life in the balance. Fine. Most don’t. But even if you're just slapping together another CRUD app for some bloated enterprise, you still owe your users respect. You owe them dignity.
by agos
5/23/2025 at 3:42:37 PM
> you still owe your users respect. You owe them dignity.
This is moral grandstanding. You owe your customers a good product at a low cost. If you don't use a tool that can lower costs, you are wronging your users and will go out of business.
Handcrafted CRUD will go the same way as handcrafted anything: an expensive niche hobby.
by Ferret7446
5/24/2025 at 12:11:44 AM
> This is moral grandstanding.
I disagree. For the kind of relationships I want with other humans, respect and dignity are my end of the bargain.
by WarOnPrivacy
5/23/2025 at 3:27:04 PM
> you still owe your users respect. You owe them dignity.
What does that even mean?
Users don't care if code is written by a human or AI; they care that the code gives them what they need, hopefully in a fairly pleasant manner.
by jonaustin
5/23/2025 at 1:34:04 AM
Yeah, and the article talks about those ways in which AI is useful. Overall, the author doesn’t have a problem with experts using AI to help them. The main argument is that we’re calling AI a copilot, and many newbies may be trusting it or leaning on it too much, when in reality, it’s still a shitty coworker half the time. Real copilots are actually your peers and experts at what they do.
> Now? We’re building a world where that curiosity gets lobotomized at the door. Some poor bastard—born to be great—is going to get told to "review this AI-generated patchset" for eight hours a day, until all that wonder calcifies into apathy. The terminal will become a spreadsheet. The debugger a coffin.
On the other hand, one could argue that AI is just another abstraction. After all, some folks may complain that over-reliance on garbage collectors means that newbies never learn how to properly manage memory. While memory management is useful knowledge for most programmers, it rarely comes up in practice for many modern professional tasks. That said, at least knowing about it means you have a deeper level of understanding and mastery of programming. Over time, all those small, rare details add up, and you may become an expert.
I think AI is in a different class because it’s an extremely leaky abstraction.
We use many abstractions every day. A web developer really doesn’t need to know how deeper levels of the stack work — the abstractions are very strong. Sure, you’ll want to know about networking and how browsers work to operate at a very high level, but you can absolutely write very nice, scalable websites and products with more limited knowledge. The key thing is that you know what you’re building on, and you know where to go learn about things if you need to. (Kind of like how a web developer should know the fundamental basics of HTML/CSS/JS before really using a web framework. And that doesn’t take much effort.)
AI is different — you can potentially get away with not knowing the fundamental basics of programming… to a point. You can get away with not knowing where to look for answers and how to learn. After all, AIs would be fucking great at completing basic programming assignments at the college level.
But at some point, the abstraction gets very leaky. Your code will break in unexpected ways. And the core worry for many is that fewer and fewer new developers will be learning the debugging, thinking, and self-learning skills which are honestly CRITICAL to becoming an expert in this field.
You get skills like that by doing things yourself and banging your head against the wall and trying again until it works, and by being exposed to a wide variety of projects and challenges. Honestly, that’s just how learning works — repetition and practice!
But if we’re abstracting away the very act of learning, it is fair to wonder how much that will hurt the long-term skills of many developers.
Of course, I’m not saying AI causes everyone to become clueless. There are still smart, driven people who will pick up core skills along the way. But it seems pretty plausible that the % of people who do that will decrease. You don’t get those skills unless you’re challenged, and with AI, those beginner level “learn how to program” challenges become trivial. Which means people will have to challenge themselves.
And ultimately, the abstraction is just leaky. To a novice, AI might look like it solves your problems for you, but once you see through the mirage, you realize that you cannot abstract away your core programming & debugging skills. You actually have to rely on those skills to fix the issues AI creates for you — so you better be learning them along the way!!
Btw, I say this as someone who does use AI coding assistants. I don’t think it’s all bad or all good. But we can’t just wave away the downsides just because it’s useful.
by anon7000
5/23/2025 at 1:43:27 AM
> Btw, I say this as someone who does use AI coding assistants. I don't think it's all bad or all good. But we can't just wave away the downsides just because it's useful
Isn't this just the rehashed argument against interactive terminals in the 60s/70s (no longer need to think very carefully about what you enter into your punch cards!), debuggers (no longer spending time looking carefully at code to find bugs), Intellisense/code completion (no need to remember APIs!) from the late 90s, or Stack Overflow (no need to sift for answers to questions that others have had before!) from the 00s? I feel like we've been here before and moved on from it (hardly anyone complains about these anymore; no one is suggesting we go back to programming by rewiring the computer), and I wonder if this time it will be any different. Kids will just learn new ways of doing things on top of the new abstractions, just like they've done for the last 70 years of programming history.
by seanmcdirmid
5/23/2025 at 1:53:18 AM
Interactive terminals didn’t write code for you, and they also unlocked entirely new paradigms of programs. Debuggers, if anything, enabled deeper understanding. Intellisense is in fact a plague and should not exist. Stack Overflow, when abused, is nearly as bad as AI.
by sgarland
5/23/2025 at 5:43:57 AM
I think we should just agree to disagree. All of those opened up new paradigms for programming, and so will AI even if we aren’t quite sure what that new paradigm is yet. There will always be people claiming the old-fashioned way is better, like Dijkstra’s famous complaint about kids not using punch cards anymore and how that meant they weren’t learning how to be good programmers.
by seanmcdirmid
5/23/2025 at 7:39:13 AM
We're actually quite certain what this new paradigm is, because some poor souls are already practicing it: slop coding. You prompt-whisper poorly defined changes to make, and if the machine chokes on its own vomit along the way, you delete everything and try again.
It feels reasonable and consistent to see it as another "old man yells at the skies" scenario, but I do think it's unprecedented for a machine to automate thought itself on an unbounded domain and with such unreliability. We know calculators made people worse at mental math, but at least calculators don't give you off-by-one errors 40–60% of the time with no method of verification.
The reason we haven't lost literacy to Speakwrites and screen readers is that they required more time and effort than doing it yourself. With AI, the supposed time savings are obvious: you don't put hours into reading the source to write an essay, you just ask ChatGPT; you don't learn programming fundamentals, you just ask for a script that does X, Y, and Z; and so on. It feels like a good choice, but you're permanently crippling your education, both in a structured course and in the wild, and the supposed oracle is a slot machine, costing you $avg_tokens*$model_rate a pull. The bad news is slot machines sell.
by pona-a
5/23/2025 at 2:29:06 PM
I don’t think we’ve really figured out how to use AI in coding yet; vibe coding doesn’t really feel like it’s it. Vibe coding is just generating code, much like how some people claim IntelliSense is just there to save on typing, when it’s actually a great way to browse, in situ, which members can be selected on a value of a certain type.
There is definitely a way to abuse AI in programming, but it doesn’t seem to be very compelling, and I don’t think it will get people who do that very far (e.g. relying on IntelliSense to save on typing rather than just learning how to type).
ChatGPT is a great writing tool if you already know how to write. You can curate and modify on top of it, allowing you to write your paper faster with the same quality. But again, people just using it to write essays or papers without knowing how to write themselves aren’t going to get good results.
by seanmcdirmid
5/23/2025 at 3:54:07 PM
I understand what you mean, but let's be honest: this is a rare kind of tool, one that's more useful for feigning competence, deceiving yourself and others, and producing industrial volumes of slop than it is for doing better work. IntelliSense is just dynamic documentation, which has existed since at least Emacs — it doesn't do the thinking for you.
Professional tools, from music notation and art to typesetting and programming, are about translating an image inside your mind into something physical. When you know what you're doing, the lack of an interpretable mapping between prompt and generation means you spend more time trying to describe what you want to write instead of just writing it. I'd be much happier with code generation if it could take a formal specification and either return an error or something that provably implements it. Maybe interpretability research will one day change that, but as they are now, these models are simply not tunable or reliable enough to be used as tools. And yes, prompting doesn't count when they increasingly disregard your instructions.
There are many valid uses: I have a tiny WolframAlpha-like script that lets me type some basic computations and the LLM translates that to Python. I sometimes use LLM completions to get some inspiration when writing prose — while I usually discard them, they still help me think. They can often act as better grammar checkers than LanguageTool, and they make a nice companion to smaller translation models, both having their own quirks.
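For what it's worth, a minimal sketch of that kind of computation script could look something like this (assuming the OpenAI Python client; the model name, prompt wording, and eval-based evaluation are just illustrative choices, not necessarily what anyone actually runs):

```python
# Toy WolframAlpha-style helper: the LLM translates a plain-English
# computation into a single Python expression, which is then evaluated locally.
# Assumes the OpenAI Python client; model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()

def compute(question: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Translate the user's request into one Python expression. "
                        "Reply with the expression only, no prose."},
            {"role": "user", "content": question},
        ],
    )
    expr = resp.choices[0].message.content.strip()
    # eval() is acceptable for a personal toy, never for untrusted input.
    return f"{expr} = {eval(expr)}"

if __name__ == "__main__":
    print(compute("compound interest on 1000 at 5% for 10 years"))
```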
But most of this doesn't need these larger and larger models; I haven't yet tried, but I think fine-tuning some mid-size open-weights LLMs will yield similar or better results. The industry sold the public AGI, not better auto-complete, a fuzzy parser, or a smarter translator, and now they're burning growing piles of money on a saturated research direction to maintain the delusion that the singularity is 5 months away.
by pona-a
5/23/2025 at 11:12:29 AM
Yes, and as a reminder, this is now an issue much wider than programming:
by BlueTemplar
5/23/2025 at 4:20:02 AM
> On the other hand, one could argue that AI is just another abstraction
I, as a user of a library abstraction, get a well-defined boundary and interface contract — plus assurance it’s been put through its paces by others. I can be pretty confident it will honor that contract, freeing me up to not have to know the details myself or second-guess the author.
by jgraettinger1
5/23/2025 at 1:07:40 AM
It’s funny because I made a few funny clips (to my taste) on Google Whisk and figured, hey, why not, let’s make a TikTok. Did you know that all of TikTok is full of millions of AI-generated clips and other people just copying each other’s stuff? I really thought there was something to this “original creation” stuff.
We are all so simply reproducible. No one’s making anything special, anywhere, for the most part. If we all uploaded a TikTok video of daily coding, it would be the same fucking app over and over, just like the rest of TikTok.
Elon may have been right all along: there’s literally nothing left to do but go to Mars. Some of us were telling many of you just two years ago that LLMs don’t hallucinate as much as you think, and I think the late-to-the-party crowd needs to hear us again - we humans are not really necessary anymore.
!RemindMe in 2 years
by ivape
5/23/2025 at 2:56:48 AM
There's literally genocide and war going on; solve that.
by whattheheckheck
5/23/2025 at 3:58:02 AM
Don't encourage them. They'll just build an AI to do the genocide faster.
by queenkjuul
5/23/2025 at 7:49:31 AM
The genocide is televised; it appears no one cares.
by ivape