5/19/2025 at 5:25:28 AM
When a person can't do something because it exhausts their patience, we usually describe the task not as difficult but as tedious, repetitive, boring, etc. So this article reinforces my view that the main impact of LLMs is their abilities at the low end, not the high end: they make it very easy to do a bad-but-maybe-adequate job at something that you're too impatient to do yourself.
by BrenBarn
5/20/2025 at 2:58:42 AM
I agree with this more daily.
Converting a dictionary into a list of records when you know that's what you want ... easy, mechanical, boring af, and something we should almost obviously outsource to machines. LLMs are great at this.
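As an illustration of the kind of mechanical transformation being described, here is a minimal Python sketch (the field names and data are hypothetical):

```python
# A mapping from id to attributes -- the "dictionary" shape.
users = {
    "alice": {"age": 34, "role": "admin"},
    "bob": {"age": 28, "role": "viewer"},
}

# Fold each key back into its value as an explicit "id" field,
# producing the "list of records" shape.
records = [{"id": key, **attrs} for key, attrs in users.items()]

print(records)
# [{'id': 'alice', 'age': 34, 'role': 'admin'},
#  {'id': 'bob', 'age': 28, 'role': 'viewer'}]
```

Trivial to write, yes, but exactly the sort of rote reshaping that is easy to delegate.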
Deciding whether to use a dictionary or a stream of records as part of your API? You need to internalize the impacts of that decision. LLMs are generally not going to worry about those details unless you ask. And you absolutely need to ask.
by perrygeo
5/20/2025 at 8:30:14 AM
> Deciding whether to use a dictionary or a stream of records as part of your API? You need to internalize the impacts of that decision. LLMs are generally not going to worry about those details unless you ask. And you absolutely need to ask.
OTOH, unless I've been immersed in the larger problem space of streaming vs. batching and caching, and generally thinking on this level, there's a good chance LLMs will "think" of more critical edge cases and caveats than I will. I use scare quotes here not because of the "are LLMs really thinking?" question, but because this isn't really a matter of thinking - it's a matter of having all the relevant associations loaded in your mental cache. SOTA LLMs always have them.
Of course I'll get better results if I dive in fully myself and "do it right". But there's only so much time working adults have to "do it right", one has to be selective about focusing attention; for everything else, quick consideration + iteration are the way to go, and if I'm going to do something quick, well, it turns out I can do this much better with a good LLM than without, because the LLM will have all the details cached that I don't have time to think up.
(Random example from just now: I asked o3 to help me with power-cycling and preserving longevity of a screen in an IoT side project; it gave me good tips, and then mentioned I should also take into account the on-board SD card and protect it from power interruptions and wear. I haven't even remotely considered that, but it was a spot-on observation.)
This actually worries me a bit, too. Until now, I relied on my experience-honed intuition for figuring out non-obvious and second-order consequences of quick decisions. But if I start to rely on LLMs for this, what will it do to my intuition?
(Also, I talked about time, but it's also patience - and for those of us with executive functioning issues, that's often the difference between attempting a task and not even bothering with it.)
by TeMPOraL
5/20/2025 at 4:24:27 AM
> easy, mechanical, boring af, and something we should almost obviously outsource to machines
That's when you learn vim or emacs. Instead of editing character-wise, you move to bigger structures. Every editing task becomes a short list of commands and, with the power of macros, repeatable. Then, if you do it often, you can (easily) add a custom command for it.
by skydhash
5/20/2025 at 4:51:35 AM
Speaking of tedious and exhausting my patience… learning to use vim and emacs properly. I do like vim, but I barely know how to use it, and I've had well over a decade of opportunity to learn!
Pressing TAB with Copilot to cover use cases you've never needed to discover a command or write a macro for is actually kinda cool, IMO.
by andyferris
5/20/2025 at 9:00:43 AM
AI selling itself at the high end is much like car companies showing off shiny sports cars.
by stuaxo