3/3/2026 at 1:35:46 PM
"The future belongs to whoever understands what they just shipped." Perfect summary.
It's like we invented a world where you can finally, _finally_ speedrun an enormous legacy codebase, and we all patted ourselves on the back like that was a good thing.
by afry1
3/3/2026 at 1:43:35 PM
I like that. These AIs are legacy codebase generators. Nobody knows how it works and everyone is afraid to touch it.
by 2OEH8eoCRo0
3/3/2026 at 1:54:50 PM
Anything in production is legacy; I'm pretty sure it happens as soon as the code is shipped, regardless of who wrote it.
by silverquiet
3/3/2026 at 1:56:46 PM
True, but I think there's another dimension implied: how many devs are left who understand the code? Being able to start at zero is a fascinating surprise (compared to five years ago).
by alanbernstein
3/3/2026 at 7:42:09 PM
At least with a person, you can say that there's one person in your org who understands the code after they write it and submit it for review. Maybe they stick around for a while, maybe they move on to another job, but they were THERE at some point. They have a name. You can ask them questions about what they did. And hey, they still exist in the real world too, so you can get in touch with them even after they leave if you need to.
AI powered development is like a guy shows up, gets hired for 90 seconds, writes part of a feature, and dies instantly once the code hits the screen.
by afry1
3/3/2026 at 2:19:54 PM
A comment I cannot stop thinking about is "we need to start thinking about production as throwaway." Which is a wild thought when I look back on my career. We had so many DBs and servers that we couldn't touch because they were special snowflakes.
by kennethops
3/3/2026 at 2:13:54 PM
Yup. AI can't automate long-term responsibility and ownership of a product. It can produce output quicker, but somebody still has to be responsible to the customer using said product. The hard limit is still the willingness of the human producing the code to back what's been output.
by allenu
3/3/2026 at 9:25:49 PM
I agree with most of the article, right up to the point where the assumption is that AI will make things worse. We have reached a point of complexity and short-termism where it's standard practice to shove a huge, barely tested mass of Python, JavaScript, shell scripts, and who knows what else inside a Docker container and call it done. Complete with hundreds or thousands of intractable dependencies, many of which we know ~nothing about, and thousands of lines of esoteric configuration for servers we have barely any hope of even getting to run optimally, let alone securely.
Most software has been awful for a while.
Already, with AI:
- We can build everything in a statically typed, analysable, and memory safe language like Rust[0], even bits that really have to interact with the shell or wider OS and would previously have been shell scripts
- We can insist on positively deranged amounts of testing, at every level
- We can easily cut the number of dependencies in our code by >50%, in many cases more like 90%
- We can do the refactor as soon as it becomes obvious that it would be a good idea
- We can implement quality of life admin and monitoring features that would otherwise sit on the backlog for eternity
- We can educate ourselves about what we've built[1] by asking questions about our codebase, build tools to understand the behaviour of our systems, etc.
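To make the first bullet concrete, here is a minimal sketch of the kind of task that would previously have been a fragile shell one-liner (something like `ls | wc -l`) written instead as a small, statically typed, memory-safe Rust program. The directory path in `main` is just an illustrative choice:

```rust
use std::fs;
use std::io;

// Count regular files in a directory -- the sort of glue job that
// would traditionally live in a shell script, now expressed with
// explicit error handling and compile-time type checking.
fn count_files(dir: &str) -> io::Result<usize> {
    let mut n = 0;
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        if entry.file_type()?.is_file() {
            n += 1;
        }
    }
    Ok(n)
}

fn main() -> io::Result<()> {
    // "." is an example path; errors (missing dir, bad permissions)
    // surface as values rather than silent shell failures.
    let n = count_files(".")?;
    println!("{} files", n);
    Ok(())
}
```

Unlike the shell version, a missing directory or permission problem here is an explicit `Err` you must handle, not an exit code that a pipeline may silently swallow.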
So yes, I agree that "The Future Belongs to Whoever Understands What They Shipped", but unlike the author I am somewhat optimistic[2]. There is more opportunity than ever to build and understand extremely high quality software that does not accept technical debt, corner cutting to meet deadlines, or poor quality (in design or implementation), for those engineers who are knowledgeable enough and willing to embrace the new tools.
And AI, and the tooling around it, is only getting better.
[0] or Go or even TypeScript, but there's precious little reason not to pick Rust for most use cases now
[1] of course we need to choose to, and many won't…
[2] of course, there'll also be near-infinite valueless slop and some people will get sucked into that, but this seems little different to regurgitated SEO spam, short form video, and all the non-AI enshittification we already put up with, and perhaps AI will help more of us do a better job of avoiding it
by barnabee
3/3/2026 at 2:19:14 PM
We are speedrunning legacy "codebases" all the time. Or do you conjure up your own pickaxe, mine your own minerals, produce your own electricity, and construct your own computers and networks first before you go off to develop an application? Would you even know how to do those things? That is all an enormous legacy codebase that we speedrun all the time. Just add one more to it.
by iammjm
3/3/2026 at 7:58:01 PM
When I use a library, someone understood it when they shipped it. It also had a long stabilisation period where bugs were fixed in it. When I use an LLM, potentially nobody understands what was just shipped, and it has had no time to stabilise.
by discreteevent
3/3/2026 at 3:11:20 PM
I sure don't. But when I'm using all of those things (pickaxe, mineral mine, power station, internet network hub), I know that there was a thinking human being who took some measure of human care and consideration when creating them. And that there are people on the other side of the economic transaction to talk to or hold accountable when something goes wrong.
by afry1
3/3/2026 at 7:11:51 PM
You're talking about layers of abstraction; OP was talking about an ever-ballooning mass of code in the same layer of abstraction.
by GeoAtreides
3/3/2026 at 2:51:54 PM
That's all legacy, but none of it is speedrunning. If we could conjure pickaxes and electric power plants in a single day, that would be speedrunning.
by scared_together