2/3/2026 at 4:04:29 PM
The great thing about LLMs being more or less commoditized is that switching is so easy. I use Claude Code via the VS Code extension. When I got a couple of 500 errors just now, I simply copy-pasted my last instructions into Codex and kept going.
It's pretty rare that switching costs are THAT low in technology!
by davedx
2/3/2026 at 4:06:55 PM
It’s not a moat, it’s a tiny groove on the sidewalk. I’ve experienced the same. Even guide markdown files that work well for one model or vendor will work reasonably well for the other.
by shermantanktop
2/3/2026 at 4:26:11 PM
Which is exactly why these companies are now all focused on building products rather than (or alongside) improving their base models. Claude Code, Cowork, Gemini CLI/Antigravity, Codex - all proprietary, and none allow model swapping (or only with heavy restrictions). As models get more and more commoditized, the idea is to enforce lock-in at the app level instead.
by paxys
2/3/2026 at 4:36:16 PM
FWIW, OpenAI Codex is open source, and they help other open source projects like OpenCode integrate their accounts (not just the expensive API), unlike Anthropic, who blocked it last month and forces people to use its closed-source CLI.
by alecco
2/3/2026 at 5:02:47 PM
Gemini CLI is open source too, though I think the consensus is it's a distant third behind Claude Code and Codex.
by gundmc
2/3/2026 at 4:55:48 PM
The classic "commoditize your complements."
by _aavaa_
2/3/2026 at 4:38:22 PM
I only integrate with models via MCP. I highly encourage everybody to do the same to preserve the commodity status.
by bloppe
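For readers unfamiliar with what that looks like in practice, here is a minimal sketch of a vendor-neutral MCP tool server, assuming the FastMCP helper from the official `mcp` Python SDK; the server name and the tool itself are invented for illustration. Because the integration is exposed over the protocol rather than a vendor SDK, any MCP-capable client can use it, whichever model sits behind that client.

```python
# Minimal sketch of a vendor-neutral MCP tool server (illustrative only).
# Assumes the official `mcp` Python SDK; server name and tool are made up.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("commodity-tools")  # hypothetical server name


@mcp.tool()
def word_count(text: str) -> int:
    """Count whitespace-separated words in a piece of text."""
    return len(text.split())


if __name__ == "__main__":
    # Speak MCP over stdio; any MCP-capable client can launch and call this,
    # regardless of which model or vendor the client itself uses.
    mcp.run()
```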
2/3/2026 at 4:30:05 PM
Using "low cost" and LLMs in the same sentence is kind of funny to me.
by LetsGetTechnicl
2/3/2026 at 5:24:32 PM
The switching cost is so low that I find it's easier and better value to have two $20/mo subscriptions from different providers than a $200/mo subscription with the frontier model of the month. Reliability and model diversity are a bonus.
by bgirard
2/3/2026 at 5:15:07 PM
I genuinely don't know how any of these companies can make extreme profits for this reason. If a company makes a significantly better model, shouldn't it be able to explain how it's better to any competitor?
Google succeeded because it understood the web better than its competitors. I don't see how any of the players in this space could be so much better that they could take over the market. It seems like these companies will create commodities, which can be profitable, but that is also incredibly risky for early investors and doesn't generate the profits that would be necessary to justify today's valuations.
by harrisi
2/3/2026 at 5:20:26 PM
> If a company makes a significantly better model, shouldn't it be able to explain how it's better to any competitor?
No. Not if it wasn't trained on any materials that reveal the secret sauce behind why it's better.
LLMs don't possess introspection into their own training process or architecture.
by crazygringo
2/3/2026 at 5:58:37 PM
That's my point. Anything that existed that was significantly "better" would be able to share more about its own creation. And anything that could be significantly better would have to be capable of "understanding" things it wasn't trained on.
by harrisi
2/3/2026 at 7:03:26 PM
That's not true. There are a million ways to be "significantly better" that don't involve knowledge about the model's creation. It can be 10x or 100x or 1000x more accurate at coding, for example, without knowing a single thing more about its own internal training methodology.
by crazygringo
2/3/2026 at 4:18:24 PM
> It's pretty rare that switching costs are THAT low in technology!
Look harder. Swapping USB devices (mouse, …) takes even less time. Switching Wi-Fi is also easy. Switching browsers works the same way. I can equally use vim/emacs/vscode/sublime/… for programming.
by skydhash
2/3/2026 at 4:27:21 PM
Switching between vim <-> emacs <-> IDEs is way harder than swapping a USB device (unless you already know how to use them).
by pchristensen
2/4/2026 at 3:07:57 AM
I don't know, USB-A takes 3 attempts to plug in for some reason.
by cozzyd
2/4/2026 at 3:36:03 AM
Sometimes four!
by bmitc
2/3/2026 at 4:33:08 PM
Good point: those are standards, and by definition society forced vendors to behave and play nice together. LLMs are not standards yet, and it is pure bliss that English works fine across different LLMs for now. Some labs are trying to push their own formats and end that portability, especially around reasoning traces, e.g. Codex removing reasoning traces between calls and Gemini requiring reasoning history (see the sketch below). So don't take this for granted.
by ahmadyan
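A schematic sketch of the divergence this comment describes. The item shapes and field names below are illustrative only, not any vendor's actual wire format; the point is that one convention strips reasoning items before the next call, while the other expects the full history echoed back.

```python
# Illustrative only: these dict shapes are not any provider's real payload format.
history = [
    {"type": "message", "role": "user", "content": "Refactor utils.py"},
    {"type": "reasoning", "content": "<provider-internal reasoning trace>"},
    {"type": "message", "role": "assistant", "content": "Done, see the diff."},
]


def next_request_dropping_reasoning(history, user_msg):
    """Convention A: reasoning items are stripped before the next call,
    so traces never round-trip through the client."""
    kept = [item for item in history if item["type"] != "reasoning"]
    return kept + [{"type": "message", "role": "user", "content": user_msg}]


def next_request_keeping_reasoning(history, user_msg):
    """Convention B: the full history, reasoning included, must be sent back,
    so a client written for convention A would silently misbehave here."""
    return list(history) + [{"type": "message", "role": "user", "content": user_msg}]
```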
2/3/2026 at 5:22:37 PM
I dunno. Text is a pretty good de facto standard. And LLMs work in lots of languages, not just English.
by crazygringo
2/3/2026 at 4:57:47 PM
You make it sound like lock-in doesn't exist. But your examples are cherry-picked, and they're all standards anyway; their _purpose_ was easy switching between implementations.
by amelius
2/3/2026 at 4:27:31 PM
Most people only have one mouse or Wi-Fi network. If my Wi-Fi goes down, my only other option is to use a mobile hotspot, which is inferior in almost every way.
by NicuCalcea
2/3/2026 at 5:20:54 PM
> Most people only have one mouse
Tell me you're not a Mac user without telling me you're not a Mac user...
by oneeyedpigeon
2/3/2026 at 6:55:37 PM
Thankfully, not a Mac user, or even a wireless mouse user.
by NicuCalcea
2/3/2026 at 5:23:02 PM
Huh?
by crazygringo
2/3/2026 at 5:27:18 PM
The default Apple mouse needs a backup because it still cannot be charged and used at the same time.
by oneeyedpigeon
2/3/2026 at 4:38:51 PM
I mean, Sublime died overnight when VS Code showed up.
by whatever1
2/3/2026 at 5:38:16 PM
On some agents you just switch the model and carry on.
by falloutx
2/3/2026 at 6:24:00 PM
Except Kimi Agent via the website is hard to replace. I tried the same task in Claude Code, Codex, and Kimi Agent, and for office tasks the results are incomparable: the versions from Anthropic and OpenAI are far behind.
by benterix