3/24/2026 at 9:27:40 PM
This might sound like snark, but I truly don’t mean it that way. I think what’s interesting about AI, and why there’s so much conversation, is that in order to be a good user of AI, you have to really understand software development. All the people I work with who are getting the most value out of using AI to deliver software are people who are already very high-skilled engineers, and the more years of real experience they have, the better.
I know some guys who were road warriors for many years — everything from racking and cabling servers, setting up infrastructure, and getting huge cloud deployments going all the way to embedded software, video game backends, etc. These guys were already really good at automation, seeing the whole life cycle of software, and understanding all the pressure points. For them, AI is the ultimate power tool. They’re just flying with it right now. (All of them are also aware that the AI vampire is very real.)
There’s still a lot to learn, and the tools are still very, very early on, but the value is clear.
I think for quite a few people, engaging with AI is maybe the first time ever in their entire career they are having to engage with systems thinking in a very concrete and directed way. This is why so many software engineers are having an identity crisis: they’ve spent most of their career focusing on one very small section of the overall SDLC, while believing that was mostly all they needed to know.
So I think we’re going to keep talking for quite a while, and the conversation will continue to be very unevenly distributed. Paradoxically, I’m not bored of it, because I’m learning so much listening to intelligent people share their learnings.
by _doctor_love
3/24/2026 at 9:48:55 PM
Hey, I don't think this sounded like snark at all. Super grounded take.

> I think what’s interesting about AI, and why there’s so much conversation, is that in order to be a good user of AI, you have to really understand software development.
This I agree with completely. You can see it in the difference between a prompt where you know exactly what you want and one where things are a little woolly. A tool in the hands of a well-trained craftsperson is always better used.
> So I think we’re going to keep talking for quite a while

Me neither, and to be clear I'm okay with that. This was mostly a rant at the lack of diversity of discourse.
by jakelsaunders94
3/24/2026 at 10:01:14 PM
Thanks friend! Appreciate it.

Agree, the diversity of the discourse is not great. There's a lot of "omg I just got started waaauw" articles out there along with "we're all gonna die!" stuff. And then a few seams of very excellent insight.
Deep research at least helps with dowsing for the knowledge...
by _doctor_love
3/24/2026 at 10:18:22 PM
Your HN handle is one of my top 10 fav tracks: https://www.youtube.com/watch?v=q2RSniyYNSc ❤
by artur_makly
3/25/2026 at 12:41:12 AM
And here I was expecting this...
by mindcrime
3/25/2026 at 2:45:44 AM
And here I was expecting this... https://www.youtube.com/watch?v=Q0iuMByFjCQ&list=PLEouLkiLHd...
by RyanOD
3/25/2026 at 12:56:58 AM
This would be a more compelling argument if the conversations weren't so extremely dull and derivative, with most of the articles written in LLMspeak. I see a lot of discussion and not a lot of substance; articles and discussions about AI have a much smaller chance of being compelling compared to any other technical subject posted on HN.
by llmthrow0827
3/25/2026 at 10:34:18 AM
The signal-to-noise ratio seems worse than in many other hypes, but it's the general way hypes go.

It's really hard to separate the wheat from the chaff at this point, but I've been positively surprised by the relatively few articles sharing their more advanced workflows, lessons learnt that help me avoid the traps, and patterns emerging that taught me something new (or at least validated approaches I tried on my own which worked). It gets tiresome to keep pace, so I try to not fall for FOMO, and avoid experimenting too much so as to not get lost until I see a pattern emerging from different sources.
by piva00
3/24/2026 at 9:30:38 PM
Spot on take. The people I’ve noticed that say things like “it’s not useful” are the ones who are doing so little they can’t see the value.

This isn’t to say there’s not hype. Just that if you’re not seeing big productivity gains you need to make sure you really are an outlier and not just surplus to requirements.
by bengale
3/24/2026 at 11:08:44 PM
I rarely come across people who flat out say "it's not useful". They exist, but IME they're the minority.

Rather, I hear a lot of nuanced opinions of how the tech is useful in some scenarios, but that the net benefit is not clear. I.e. the tech has many drawbacks that make it require a lot of effort to extract actual value from. This is an opinion I personally share.
In most cases, those "big productivity gains" are vastly blown out of proportion. In the context of software development specifically, sure, you can now generate thousands of lines of code in an instant, but writing code was never the bottleneck. It was always the effort to carefully design and implement correct solutions to real-world problems. These new tools can approximate this to an extent, when given relevant context and expert guidance, but the output is always unreliable, and very difficult to verify.
So anyone who claims "big productivity gains" is likely not bothering to verify the output, which in most cases will eventually come back to haunt them and/or anyone who depends on their work. And this should concern everyone.
by imiric
3/25/2026 at 1:27:50 AM
"productivity" is a misnomer. Sort of. The things I'm building are all things I've had on the back burner for years. Most of which I never would have bothered to do. But AI lets me ignore that excuse and just do it.
by hparadiz
3/25/2026 at 2:28:25 AM
The productivity comes from not having the startup costs. You don’t need to research the best way to do X, just verify that X works via tests and documentation. I find it still takes T hours to actually implement X with an agent, but if I didn’t know how to do X it eliminates that startup cost which might make it take 3T hours instead.

The only downside is not learning about method Y or Z that work differently than X but would also be sufficient, and you don’t learn the nuances and details of the problem space for X, Y, and Z.
by ok_dad
3/25/2026 at 9:52:23 AM
> just verify that X works via tests and documentation.

No, it's: verify that approach X is semantically correct, makes architectural sense, and has a valid design, and then add tests and documentation. Basically, 80% of the work.
by lenkite
3/25/2026 at 2:32:36 AM
I find it useful to use a brainstorming skill to teach me X, Y, and Z, help me understand the tradeoffs for each, and say what it'd recommend.

I've learned about the outbox pattern, eventual consistency, the CAP theorem, etc. It's been fun. But if I didn't ask the LLM to help me understand, it would have just gone with option A without me understanding why.
by abustamam
3/25/2026 at 7:56:26 AM
> You don’t need to research the best way to do X, just verify that X works via tests and documentation.

"Just verify" is glossing over a lot of difficult work, though. It doesn't just involve checking whether the program compiles and does what you wanted—that's the easy part. You should also verify that the program is secure, robust, reasonably performant, efficient, etc. Even if you think about these things, and ask the tool to do this for you, generate tests, etc., you will have the same verification problem in that case as well. The documentation could also be misleading, and so on. At each step of this process there will likely always be something you missed, which, considering you're not experienced in X, Y, or Z, you have no ability to properly judge.
You can ignore all of this, of course, which the majority of people do, but then don't be surprised when it fails in unexpected ways.
And verification is actually relatively simple for software. In many other fields and industries verification is very impractical and resource intensive. It doesn't take a genius to deduce the consequences of all of this. Hence, the net effect of these tools is arguably not positive.
by imiric
3/25/2026 at 4:53:33 AM
> writing code was never the bottleneck

This is overly dismissive; there are many things that are possible now that weren't before because writing the code is no longer the bottleneck, like porting parts of the codebase from managed to unmanaged for teams with limited capacity. Writing code is about 1/3rd of the job. Another 1/3rd is analysis, which also benefits from AI allowing people who aren't very good at it to outperform. The final 1/3rd is-
> the effort to carefully design and implement correct solutions to real-world problems.
That's problem-solving - that part doesn't get sped up, and likely never will, reliably.
by kaiokendev
3/24/2026 at 11:14:01 PM
That's only because we're trying to not be too condescending.
by SpaceNoodled
3/25/2026 at 2:30:48 AM
Really? On HN I see so many AI naysayers who say either that it's not useful or that it's a net negative on productivity. Perhaps they are a minority, but they're certainly a vocal one.
by abustamam
3/25/2026 at 1:49:14 AM
[dead]
by AbanoubRodolf
3/24/2026 at 9:33:06 PM
This is really not true. There are stories of people who had no background in software engineering who now write entire applications using AI. And I have personally seen this happen.
by amelius
3/25/2026 at 12:14:43 AM
Before AI, there were also stories of people who had no background in software engineering who wrote entire applications using their fingers. This was called "learning to be a software engineer".

I don't mean to snipe at AI, because it really does seem to have set more people on the path of learning, but I was writing VB5 apps when I was 14 by copying poorly understood bits and pieces from books. Now people are doing basically the same but with less typing and everyone thinks it's a revolution.
by strken
3/25/2026 at 3:36:46 AM
I have never seen people learn how to be a software engineer in a weekend tho.
by anon-3988
3/25/2026 at 5:29:12 AM
And neither do you today.

There is more to it than "being able to make an entire application", which a novice could also have pulled off in a weekend 10 years ago.
by pamcake
3/24/2026 at 9:43:15 PM
Smart people can hit the ground running if they're freed from the need to first learn the intricacies of a new language. We're going to see an explosion in the number of people writing software as clever people who invested their time in something other than learning to program are now able to write software for themselves.
by mikkupikku
3/25/2026 at 2:37:32 AM
This may be true, but define an entire application. Is it a CRUD app? Is it an app that scales to a thousand, ten thousand, a million users? Is it an app that is bug-free and, if not bug-free, easy to fix said bugs with few to no regressions? Is it an app that is easy to maintain and add new features to without risk of breaking other stuff?

I think it is genuinely impressive to be able to build one app with AI. But I haven't seen evidence that someone could build a maintainable, scalable app with AI. In fact, anecdotally, a friend of mine who runs an agency had a client who vibe coded their app, figured out that they couldn't maintain it, and had him rewrite the app in a way that could be maintained long term.
Again, I'm not an Ai detractor. I use it every day at work. But I've needed to harden my app and rules and such, such that the AI cannot make mistakes when I or another engineer is vibing a new feature. It's a work in progress.
by abustamam
3/24/2026 at 10:00:55 PM
What is not true, that "so many software engineers are having an identity crisis"?

I don't believe they said that folks new to AI can't make impressive use of it. They did however say that senior folks with lots of scrappy and holistic knowledge can do amazing things with it. Both can be true.
by switchbak
3/25/2026 at 12:46:49 AM
I've seen people generate a lot of vibeslop with AI, but they didn't actually "write entire applications using AI".

They still have absolutely no clue how it works, so how could they "write entire applications"? They vibed it, but they certainly didn't write any of it, not one bit of it, and they're clueless as to how to extend it, upgrade it, and maintain it so that the AI doesn't make it a bloated monstrosity of AI patches and fixes and workarounds that they simply could never begin to understand.
They were also following a dozen youtube tutorials step by step, so even that part was someone else doing the thinking.
Yeah, these are the same guys constantly bugging me to help them figure something out.
by leptons
3/24/2026 at 9:40:33 PM
It's silly to say this, but one such person is "pewdiepie"
by pojzon
3/25/2026 at 1:06:53 AM
Agreed - another tool in the old tool pouch. I find it fascinating in that it provides insight into the role of language in intelligence. Certainly not AGI, but makes ELIZA seem neolithic ;)

I am amazed at the incredible things it can do - only to turn around and not be able to do a simple task a child can do. Just like people.
by strangattractor
3/25/2026 at 1:48:52 AM
The identity crisis observation is the most accurate thing in this thread. The engineers struggling most aren't struggling because AI is replacing their skills. They're struggling because AI is revealing which of their skills were incidental to their job versus central to it.A lot of software engineering career capital was built on knowing which obscure method to call, which Stack Overflow answer to trust, how to navigate a specific framework's quirks. That knowledge was genuinely hard to acquire and it was a real signal. Now it's table stakes. The career capital that survives is knowing why you'd make a particular architectural decision, how to tell if generated code is actually correct, what the error message is really telling you.
The road warrior framing is right. Those people internalized systems thinking across the whole stack over years. AI doesn't replace that — it makes it worth more, because now one person with that mental model can move faster than a team without it. The people who are "bored of AI" are often the people who already made that transition and stopped finding it novel. The people still anxious about it usually haven't yet.
by AbanoubRodolf
3/25/2026 at 2:28:45 AM
This! I've actually learned a lot about what I don't know by using AI. It made me dig into learning proper systems design, app architecture, etc.

But at the same time the more I read about AI, the more I realize I need to learn about AI. Thus far I'm just using cursor and the Claude code extension alongside obra superpowers, and I've been quite happy with it. But on Twitter I see people with multiple instances of Claude code or open claw talking to each other and I don't even know how to begin to understand what's going on there. But I'm not letting myself get distracted — Claude code and open claw are tools. They could go away at any time. But systems thinking is something that won't go away. At least, that's my gambit.
by abustamam
3/25/2026 at 3:25:12 AM
It’s telling those people mostly talk about the complexity of the AI setup they’ve engineered to write code. Much more so than bragging about the software created by that process.
by jimbokun
3/25/2026 at 4:56:54 AM
That's a good point, but I've seen a lot of interesting meta-setups as well (like visualizations of agents interacting with one another).Does it write good code? I dunno. But it looks cool, and I think interesting in its own right, even if it ends up being functionally useless.
by abustamam
3/24/2026 at 10:00:49 PM
The "AI Vampire", huh. Unironically, I've been feeling that way.

Well, there were also a lot of unrelated things that happened around last November for me, but yes, getting into vibecoding for real was one of them, and man, I feel physically drained coming back from work and going to use more AI.
Not sure what it is. I'm using AI personally to learn and bootstrap a lot of domain knowledge I never would have learned otherwise (even got into philosophy!), but man is it exhausting keeping up with AI. I would burn through a week's worth of credits in a day, and now I haven't vibe coded in a week.
I think, I will chill. One day at a time.
by sigbottle
3/24/2026 at 10:03:57 PM
AI Vampire is from Steve Yegge, credit where it's due.

My take is that it's similar to what Amber Case described in Calm Technology - with AI you are not steering one car, you're really steering three cars at the same time. The human mind isn't really designed for that.
I am finding that really structuring my time helps in terms of fighting back. And adopting an hours restriction, even if I could rage for 4 more hours, I don't. Instead I stop and go outside.
by _doctor_love
3/24/2026 at 9:53:14 PM
> I’m learning so much listening to intelligent people share their learnings.

Me too. A key purpose of HN, and a bright time for that.
by QuantumGood
3/24/2026 at 11:47:35 PM
AI Vampire is so perfect. I've never thought of it that way but it's right there.
by username135
3/24/2026 at 9:39:14 PM
absolutely. As an early/mid-level SDET/SRE, I can move so fast on prototyping full, good apps now. That style of thinking is serving me well: knowing about queues, Docker, basic infra, and good coding practices is plenty to produce decent code. Interesting time to be laid off.

AI makes a ton of bad decisions too, and it's up to you to work with it. If I had knowledge of the dangers hidden in the things I'm developing, I'd move even faster
Was able to make a great full web app, which I think is hardened for prod but it had to be refactored to do so. Which it happily did.
It's really about asking the right questions, breaking down tasks, and planning now. I'm going to tackle a huge project, hoping to share it here.
by d675
3/25/2026 at 1:31:10 AM
Completely agree. It’s very telling that the majority of write-ups on effective agentic coding are essentially summaries of software engineering best practices.
by systemsweird
3/25/2026 at 8:38:23 AM
I am having an identity crisis that thankfully is softened by being senior and close to retirement rather than in my early career days.

Since COVID I have seen teams scaled down, and lots of custom development or devops/infra work replaced with SaaS and iPaaS cloud products, serverless/lambda, managed containers.
This is the next step.
Great that people feel more productive; unfortunately for many of them, us included, more productivity means the C-suites can do some head count reduction yet again.
by pjmlp
3/24/2026 at 11:41:06 PM
If you have to really understand software development to be a good user of AI, we’re screwed. All the best users of AI we’ll ever have already exist, I think.
by deadbabe
3/25/2026 at 1:00:51 AM
That's a good point. I'm a novice self-taught developer who somehow pushed through and made a decent PM tool for the construction industry. It works, if your users aren't malicious or too demanding.

Now I'm working on a second project, all with AI. I haven't written a single line. It works better than what a non-programmer would make because I knew what to ask for. But I'll admit I'm not learning anything.
by throwawaytea
3/25/2026 at 1:48:43 AM
Can't say the same. I've been super hands on with a C project. Really getting into the details of the event bus and how to make things performant. The AI is still writing 99% of the code but I'm being super strict about what I consider acceptable.
by hparadiz
3/25/2026 at 11:05:53 AM
And when you get memory leaks and don’t know how to debug?
by deadbabe
3/25/2026 at 12:08:36 PM
[dead]
by dalmo3
3/24/2026 at 9:45:44 PM
Agreed, though I prefer "Fae Folk" to vampires.
by gAI
3/24/2026 at 11:43:18 PM
If LLMs were vampires, they'd be better at counting; if they were fae, they'd be better at legalistic logic. :p
by Terr_
3/24/2026 at 10:00:09 PM
Any thoughts on what the next generation of software devs is going to look like without as much manual experience?
by djeastm
3/24/2026 at 10:21:32 PM
When C arrived, programmers wondered what software devs would look like when they didn't have assembly experience.

Then the same happened with languages that managed memory.
And with IDE that could refactor your code in a click and autocomplete API calls.
And with Stack Overflow where people copy/pasted code they didn't understand.
by eloisant
3/24/2026 at 11:37:22 PM
I reckon there's a limit to how long this abstraction can go on before not understanding underlying mechanisms will seriously hamstring you.
by bGl2YW5j
3/25/2026 at 8:07:04 AM
Well, how many times have we seen the S3 bucket set to public while the customer data piles up and leaks out to space?
by sellmesoap
3/25/2026 at 4:02:57 AM
I think we're a long ways from that.

But with that said, those who learn the underlying mechanisms will always be able to solve more problems than the folks who don't. When you know the lower pieces, your mental model tells you when and where the higher level pieces are likely to break. Legit superpower.
by wild_egg
3/25/2026 at 4:33:59 AM
> But with that said, those who learn the underlying mechanisms will always be able to solve more problems than the folks who don't

I would define that as being "seriously hamstrung".
by bluefirebrand
3/24/2026 at 10:55:57 PM
And over and over, time proves that, when you need it, ASM or C or general systems knowledge was handy. One example: I am not a "Windows" or "NT" guy, mostly working in various Unixes and Linux in my professional career. I had a client who had battered every resource trying to fix some horrible freeze/timeout in their application. So I rolled up my sleeves, first searched "is there dtrace on windows", found some profiling tools, found the process was stuck in some dumb blocking-call loop because a resource was unavailable, and the rest was history.

So yeah, I mean - who cares how it works - but also, if you have experience in how things _do_ work, you can solve problems other people cannot.
by calvinmorrison
3/25/2026 at 1:14:27 AM
It started before that. When assemblers came out, (some) programmers worried about losing touch with the machine if they didn't have to know the instructions in octal.
by AnimalMuppet
3/24/2026 at 10:08:29 PM
Honestly, I think it will look pretty much like this one. There’s a lot of manual experience that the current generation doesn’t have.

For example, I haven’t racked and cabled a server in over 15 years. That used to be a valuable skill.
I also used to know how to operate Cisco switches and routers (on the original IOS!). I haven't thought about CIDR and the difference between a /24 and a /30 since the year 2008. Class A IP addresses: how do those work? What subnet am I on? Is this thing running on a different VLAN? Irrelevant to me these days. Some people still know it! But not as many as in the past.
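(A quick aside for anyone who never had to think about it: the /24 vs /30 distinction is just subnet size, and Python's standard-library `ipaddress` module makes it concrete. The 192.0.2.0 block below is a reserved documentation range, chosen purely for illustration.)

```python
import ipaddress

# The prefix length (/24, /30) says how many leading bits identify the
# network; the remaining bits enumerate addresses within it.
net24 = ipaddress.ip_network("192.0.2.0/24")  # 32 - 24 = 8 host bits
net30 = ipaddress.ip_network("192.0.2.0/30")  # 32 - 30 = 2 host bits

print(net24.num_addresses)       # 2**8 = 256 addresses
print(net30.num_addresses)       # 2**2 = 4 addresses

# hosts() excludes the network and broadcast addresses.
print(len(list(net24.hosts())))  # 254 usable hosts
print(len(list(net30.hosts())))  # 2 usable hosts, typical for a point-to-point link

# "What subnet am I on?" is a containment check:
print(ipaddress.ip_address("192.0.2.77") in net24)  # True
```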
The late Dr. Richard Hamming observed that once upon a time, "a good man knew how to implement square root in machine code." If you didn't know how to do that, you weren't legit. These days nobody would make such a claim.
So some skills fade and others rise. And also, software has moved in predictable cycles for many decades at this point. We are still a very young field but we do have some history at this point.
So things will remain the same the more they change on that front.
by _doctor_love
3/25/2026 at 1:59:16 AM
I am pretty sure network knowledge and all those things are still necessary for people running data centers and really big computers, and I imagine we will build a lot more of that.

Also, anyone making a homelab has to know this stuff.
by a1o
3/24/2026 at 10:58:27 PM
> So some skills fade and others rise. And also, software has moved in predictable cycles for many decades at this point. We are still a very young field but we do have some history at this point.

And there'll be a split too... like there's a giant divide between those mechanics who used to work on carburetors and the new gen with microcontrollers, injection systems, etc. People who think cars are 'too complicated' aren't wrong, but as someone who grew up in the injected era, I vastly prefer debugging issues over the CAN bus rather than snaking my ass around a hot exhaust to check something.
by calvinmorrison
3/25/2026 at 2:12:29 AM
And to take the analogy even further, I'm sure there will be a subset of people who develop really strong opinions about a particular toolchain or workflow. Like how we have people who specialize in 70s diesel trucks or 90-00s JDM sports cars, there'll likely be programmers who are SMEs at updating COBOL to Rust using Claude.
by SpecialistK
3/24/2026 at 10:27:40 PM
A post supposedly about being bored of talking about AI. But psych, it’s the same AI talking points. And psych, the top comment is the same sentiment about how the truly skilled will finally have their time to shine.

I don’t know if it’s the Universe delivering this farce or the emergent LLM Singularity.
by keybored
3/24/2026 at 10:32:38 PM
> how the truly skilled will finally have their time to shine.

That's not what I said. I said that those who are already shining are now shining even brighter. Give a great craftsman a new tool and he will find a way to apply it. If it is valueless, he will throw it away.
For what it's worth, your comment is also an HN trope, the disaffected low-effort armchair keyboard warrior.
by _doctor_love
3/24/2026 at 10:37:01 PM
Expressing a negative sentiment is a trope now?
by keybored
3/24/2026 at 10:58:02 PM
Keybored is a trending vibe, yeah.
by Rapzid
3/25/2026 at 11:55:48 AM
Is anyone else bored of talking about posting with that keybored vibe? Hey, don’t get me wrong!—I’m not some heretic. I love the keybored posting vibe. Lots of sarcasm, complaining about American geopolitical decisions, recommending socialism on some venture capitalist/hacker forum, absolutely no hint of any Show HN bragging rights or any technical accomplishments whatsoever, just lots of complaining in general (but I repeat myself). It’s fantastic and I could not go back to posting any other way. But why are we talking about it so much? Why not just live it, live that beautiful vibe, and let it permeate our whole Internet persona—in fact just let it become water around us, like we are fish, something vital to our existence and wholly unquestioned until the dam breaks somewhere.
by keybored
3/25/2026 at 7:00:14 AM
It's essentially the same argument Agile consultants made when faced with criticism about Agile: "you're not using it right"
by ludicrousdispla
3/24/2026 at 9:46:48 PM
Isn't that scary though? A bunch of people are going to be forced to use a tool that keeps them ignorant, and they absolutely won't know if it's doing correct things, to the point that as you retire, the next crop is going to be much less involved in knowing what's going on.

It's what happened with the internet and computer usage. As Apple made it easier to get online with zero computer knowledge, suddenly we're electing people like Donald Trump.
by cyanydeez
3/24/2026 at 10:38:36 PM
To me, it is very scary. I know people who have sort of "outsourced" their critical thinking to ChatGPT, so to me it's extra scary when I see it outside technical circles. They'll just believe whatever that generation of LLM tells them, because it says it so confidently, and never question or check the information. Maybe I'm naive, but I thought easier access to knowledge was supposed to make us more intelligent, not less.
by scorpioxy
3/25/2026 at 1:45:09 AM
I don't remember exactly in which book's introduction Hannah Arendt mentioned this, but she pointed out that every time humanity learned a new skill that improved its efficiency in some capacity, that skill as well as adjacent skills diminished irrevocably.AI is the thing that for the first time can think better than us (or so at least some people believe) and is seen as an efficiency booster in the world of cognition and ideas. I'd think Hannah Arendt would be worried with what we are currently seeing and where we might be headed.
by vparseval
3/24/2026 at 11:00:58 PM
> Maybe I'm naive but I thought easier access to knowledge was supposed to make us more intelligent, not less.

Turns out Lowtax was right and ahead of his time
by heavyset_go
3/24/2026 at 10:00:01 PM
Serious reply to this one: I truly don’t find it any more scary than what’s already taken place many times in human history.

We have hundreds, even thousands, of years of history showing humans committing atrocities against each other well before the advent of computers, or even the introduction of electricity. So while the tool may become so ubiquitous that there’s no option not to engage with it, I don’t think it really fundamentally alters the dynamics of human behavior.
Some people are motivated by greed. Others are motivated by nobility. It really just comes down to which wolf they're feeding.
In terms of the tool keeping people ignorant, there’s a part I agree with and a part that I don’t. I think, in terms of information dissemination, AI is probably the autocrat’s wet dream in terms of finally being able to achieve real-time redefinition of reality. That’s pretty scary, and I’m not sure what to do about it.
On the other hand, people have always been free to not really learn their craft and to just sort of get by and make a living. That was true a thousand years ago, and it’s true today. There’s always somebody who can do a really high-quality job, but they’re very expensive, and then there's a vast population who will do a medium-to-terrible job for less money. You get what you pay for. There's a reason history is primarily written about people with power and wealth: they were the only ones with the means to do anything.
I don’t agree with the assertion about the internet and the election of someone like Donald Trump. Well before the internet existed, politicians were using communication mediums to influence things and get elected—whether it was the telegraph, the telephone, or the TV. JFK famously was the first TV president (notably, he didn't wear a hat).
These technologies simply give politicians more reach, and they may change the dynamics of how voters are persuaded. But what’s true today was true three hundred years ago: there’s the face of power that you see publicly, and then there’s what really happens behind the scenes.
by _doctor_love
3/24/2026 at 10:21:25 PM
> Serious reply to this one: I truly don’t find it any more scary than what’s already taken place many times in human history

Spoken like someone who thinks they are going to be insulated from the fallout.
by bluefirebrand
3/24/2026 at 10:29:10 PM
Many of us are fine with the fallout because we understand the net benefit to humanity is going to be similar to the previous waves of automation.

Sure, it might hurt me personally. I'm not selfish enough to put that over what will be an incredibly empowering development for our species.
by solenoid0937
3/25/2026 at 12:21:22 AM
I don't believe for even a second that the net benefit to humanity is going to be positive.

This will be good for a handful of elites and no one else.
by bluefirebrand
3/24/2026 at 10:34:09 PM
> They’re just flying with it right now.

Where are they flying, and why has software gone to shit?
Maybe these superstar programmers have to keep their reality-breaking technology secret, but everything has not only degraded, but turned to absolute trash.
by heliumtera
3/24/2026 at 9:55:54 PM
Spot on, I am having the time of my life with AI, more fun than I've had in decades. But I was in the top 10% of engineering, and top 1% of the bits of engineering I do best, so it's easy for me to use AI to explore more ideas than I could have possibly explored by hand. And if I get replaced, cool bro, my investments are in compute, and compute's just getting started IMO.
by LogicFailsMe
3/24/2026 at 10:52:30 PM
> For them, AI is the ultimate power tool.

Yup
by hbarka
3/24/2026 at 11:16:32 PM
When all you've got is AI, every problem looks like... uh, whatever hole an LLM's output goes into. A garbage can, ideally.

AI seems great when you have no way of truly validating its output.
by SpaceNoodled