4/13/2026 at 1:13:06 PM
I’m reasonably convinced this is the best argument against LLMs. It’s the same reason Open is in OpenAI’s name. It is widely understood that centralizing the ownership of these tools will transform the world; that’s why the investment is so high. If power and wealth aren’t concentrated into these AI labs, the investment isn’t worth it. Which means we have to ask ourselves if we want that. There are plenty of futures that include LLMs and don’t include the centralization, but they require a departure from our current trajectory. There was also no guarantee that programming and computing would become free the way they are today.
by roxolotl
4/13/2026 at 9:00:19 PM
The best argument against them is that they're just another scheme to prop up data center companies. Use an LLM with the equivalent knowledge of the Linux kernel and a text editor? Or just git clone them.
It's another state management scheme being sold to politicians and elderly investors who don't know any better. Big tech 100% relies on elder abuse.
by yabutlivnWoods
4/13/2026 at 8:19:18 PM
> There's plenty of futures which include LLMs and don’t include the centralization but they require a departure from our current trajectory.

I don't think that's true at all. It's pretty clear that local models are the future of agentic coding, and everyone's been moving towards that goal.
It's also becoming clear that current models are much bigger than they really need to be. Recent research indicates that most transformer models can be shrunk significantly with little loss in performance.
We definitely aren't there yet, but models that run on a single consumer GPU are getting better at a pretty fast pace. Model size keeps going down, efficiency keeps going up, and compute keeps getting faster and cheaper.
I really don't see a future where enormous datacenters are the only way to run a coding agent. Huge models may continue to outperform, but the gap between them and a local model is closing quickly.
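To see why shrinking models matters for consumer hardware, here's a rough back-of-envelope sketch. The 7B parameter count and precisions are hypothetical examples, not any specific model, and this counts weights only (ignoring KV cache and activations):

```python
# Back-of-envelope VRAM estimate for model weights at different precisions.
# Illustrates why quantization can let a capable model fit on a single
# consumer GPU. The 7e9 parameter count is a hypothetical example.

def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Memory for the weights alone, in gigabytes."""
    return n_params * bits_per_param / 8 / 1e9

n = 7e9  # a 7B-parameter model, for illustration
for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    print(f"{label}: {weight_memory_gb(n, bits):.1f} GB")
# fp16: 14.0 GB, int8: 7.0 GB, int4: 3.5 GB
```

At 4-bit precision the same parameter count needs a quarter of the fp16 memory, which is the difference between needing a datacenter card and fitting in a typical consumer GPU's VRAM.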
by estimator7292