4/7/2026 at 7:01:32 AM
This article is spot on. I'm feeling the exact same way watching the industry aggressively promote the idea that it's safe to deploy unverified code just because an AI wrote the tests. We are playing with fire. If we keep treating "I don't read the code I ship" as a feature rather than a liability, it's going to cause a massive, real-world disaster. The resulting regulation will be so heavy that software engineering will end up needing a Bar Council or Medical Board just to ship a basic feature. We're cheering for a trend that is going to regulate us into a corner.
by distalx
4/7/2026 at 10:20:27 AM
But human-written code also causes real-world disasters; most human programmers are terrible, never held accountable (they usually left a while ago), cannot read or comprehend code either, and cannot write tests either. Only in an echo chamber like HN can you believe that the majority of human programmers are any good, or better than a 1-bit 7B model; they are not. Go out into the real world; most people are really, really bad at what they do, including programmers.
by anonzzzies
4/7/2026 at 2:58:48 PM
AI is coded by people. We have not reached the state in which AI creates AI.
by expedition32
4/7/2026 at 4:32:25 PM
A few of the very smartest, highest-paid people in the world, yes. The rest... well... I did not use universal quantifiers in the original post.
by anonzzzies