I agree with the article's general thrust: use AI if you want to, don't use it if you don't want to, whether or not you'd want to will probably change as AI continues to evolve, and most people seem to be getting pushed to use AI in the dumbest ways imaginable.
----
I rather strongly disagree with the framing around the environmental impacts, though; the article would make a much stronger point if it resisted the urge to peddle the same “muh water and electricity” disinformation that gets parroted all over the place by people who can't be bothered to put numbers into the context of other numbers.
For example:
> A single DGX B200 AI server is rated to consume 14,300 watts of electrical power at peak. You can cram about four of these on a rack if you like to live on the edge, and these four units might draw something like 200 amps of current combined. For a point of comparison, a typical single-family home in the United States will have wires from the utility company that are thick enough to provide a 200 amp service.
Cool, and how many households' worth of AI queries would that single B200 (let alone the rack of 'em) be able to handle? Probably a lot more than any individual household could ever hope to produce per second, even assuming a household consisting entirely of hardcore AI stans (let alone someone like the author, or like myself, who uses AI sparingly). Each of those servers is handling requests from thousands upon thousands of users; those power and water requirements get amortized over such a large quantity of requests (and people making them) that if you've ever eaten a single hamburger in your entire life then you've done more harm to the environment than hundreds (if not thousands) of those AI queries.
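To put that amortization argument on a napkin: here's a rough sketch where every constant is a loudly labeled assumption (the peak wattage comes from the article; the queries-per-second figure is purely illustrative, since real throughput depends on model, batch size, and load), not a measurement:

```python
# Back-of-envelope amortization. The wattage is the article's figure;
# the throughput is an ASSUMED illustrative number, not a benchmark.
SERVER_PEAK_WATTS = 14_300    # DGX B200 rated peak, per the article
QUERIES_PER_SECOND = 100      # assumed: a saturated server batching many users
SECONDS_PER_HOUR = 3600

# Energy per query in watt-hours, assuming the server runs flat out at peak
# the whole time (a pessimistic assumption).
wh_per_query = SERVER_PEAK_WATTS / (QUERIES_PER_SECOND * SECONDS_PER_HOUR)
print(f"{wh_per_query:.3f} Wh per query")  # ~0.04 Wh

# And that scary-sounding 200 A household service, at a US-typical 240 V:
HOUSEHOLD_WATTS = 200 * 240   # 48,000 W
print(f"{HOUSEHOLD_WATTS / SERVER_PEAK_WATTS:.1f} servers per 200 A service")
```

Even if my assumed throughput is off by an order of magnitude in either direction, the per-query figure stays in boiling-a-fraction-of-a-kettle territory, which is the whole point: the scary number is the aggregate, not the individual use.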
This all comes after this quip in the margins:
> You think you’re just gonna self-host an open weight model like GLM-5 on your personal hardware and cut out the hosting costs? Well, alright, hope you have 1,727 GB of VRAM lying around.
and like… the author does understand that not everyone needs such a large model with such a large VRAM requirement, right? Or that VRAM itself ain't even strictly necessary (it just happens to make things faster — which is more important for a server handling requests from thousands of users than it is for my laptop handling requests from exactly one user: me)? That's indeed part of the issue the author correctly identifies with people using AI in seemingly the dumbest way possible: that dumbness includes the demand for instantaneous responses, and the consequent demand for throwing more and more VRAM and SSDs at the problem, when “just make a cup of coffee while the LLM ‘thinks’ about what you asked of it” is a perfectly workable approach. As I'm typing out this comment, I've got Olmo 3.1 on this same exact machine doing a bunch of thinking about how to respond to me asking it “How much wood would a woodchuck chuck if a woodchuck could chuck wood?”¹, and it's totally fine that it's taking multiple minutes because there are other things I can do while I wait.
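For a rough sense of why that 1,727 GB figure isn't the floor: a model's weight footprint is approximately parameter count × bytes per parameter, and quantization pushes bytes-per-parameter well below FP16's two. The model size below is a hypothetical 7B chosen for illustration, and the estimate deliberately ignores activation and KV-cache overhead:

```python
# Rough weight-memory estimate: params * bytes_per_param. Ignores
# activation/KV-cache overhead; model size is an illustrative assumption.
def weight_gb(params_billion: float, bits_per_param: float) -> float:
    return params_billion * 1e9 * (bits_per_param / 8) / 1e9

# A hypothetical 7B-parameter model:
print(weight_gb(7, 16))  # FP16: 14.0 GB -> wants a hefty GPU
print(weight_gb(7, 4))   # 4-bit quantized: 3.5 GB -> fits in ordinary
                         # laptop RAM and runs (slowly) on CPU, zero VRAM
```

Slower on CPU, sure, but per the coffee-cup approach described above, slow is fine when the server's only user is you.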
This all ain't to say that we shouldn't care about AI's power and water usage. We should absolutely be pushing for better efficiency. That includes acknowledging that there are options besides “throw more and more VRAM at it and hope for the best”; the article instead prefers to assume that the big beefy servers are the only option, dismissing the notion of self-hosting with little thought, and that dismissal does the article's broader point a disservice.
----
The discussion around AI being considered a “tool” also rubbed me the wrong way a bit:
> This unlocks a common refrain from the booster class: “A true craftsperson uses every tool at their disposal!” Which, if you think about it for more than three seconds, is ridiculous on its face. Gotta dig some holes for fence posts? Okay! Bring along every shovel on the truck, the Ditch Witch, a box of ANFO and the Bagger 293. Have the people who echo this kind of stuff ever built anything in the physical world? Your average craftsperson has one real good compound miter saw that they use for basically every cut on the jobsite. They’ll use it until it breaks down, then they’ll replace it with a newer model of substantially the same thing. In what world is constantly switching tools for the sake of switching tools a remotely smart use of time?
That's pretty blatantly a strawman, and seemingly the exact opposite of how even the most vibe-codey of vibe-coders use AI. They're largely using AI as that miter saw; they might switch out blades/models for a given job, but at the end of the day it's the same tool. That's indeed yet another part of that “people using AI in the dumbest ways imaginable” problem that's otherwise correctly identified: AI maximalists having a hammer called ChatGPT and seeing everything as a nail.
And also: who cares whether or not someone brings along every shovel + the Ditch Witch + the ANFO + the Bagger 293 if it's easy enough to bring them all? That's only a problem to the extent that carrying one tool comes at the expense of one's ability to carry another tool. If you've got a big enough truck to carry all that gear around, and you're okay with taking the time to load and unload it all, then fuck it, might as well full send — and then if there happens to be a boulder blocking the path of your fence, then it's a good thing you have that ANFO handy, right?
And of course, most software developers ain't doing their work in a pickup truck in the middle of nowhere (though some are, and that's fucking rad). Most are doing their work at their desks, in their offices or homes, wherein they're probably in close proximity to the entirety of their collection of tools. Hell, even if they are doing their work in a pickup truck in the middle of nowhere, the vast majority of the tools they need are probably already present (or could readily be made present) on whatever laptop they're bringing along for the job. Toby and Lyle don't need to worry about the logistics of carrying their tools (in particular Lyle's trusty lathe) because they do their jobs in a workshop wherein those tools already live; I don't need to worry about the logistics of carrying around my compilers and editors and manpages and such (or even an LLM!) because I do my job on a laptop wherein those tools already live.
----
¹ For the record, Olmo 3.1 concluded (like most models do these days) that “If a woodchuck could chuck wood, it would chuck as much as it could—but given its actual habits, it would probably just dig a very efficient burrow instead.”