12/29/2025 at 4:55:28 PM
> The kinds of topic being discussed are not "is DRY better than WET", but instead "could we put this new behavior in subsystem A? No, because it needs information B, which isn't available to that subsystem in context C, and we can't expose that without rewriting subsystem D, but if we split up subsystem E here and here..."

Hmm, sounds familiar...
Bingo knows everyone's name-o
Papaya & MBS generate session tokens
Wingman checks if users are ready to take it to the next level
Galactus, the all-knowing aggregator, demands a time range stretching to the end of the universe
EKS is deprecated, Omega Star still doesn't support ISO timestamps
by fogleman
12/29/2025 at 6:58:25 PM
Wngman.

The number of software packages not supporting ISO 8601, TODAY (no pun intended), is appalling. For example, git (which claims compatibility, but isn't compatible).
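To make that grievance concrete with an illustration of my own (not from the comment): even Python's standard library long rejected valid ISO 8601 input from a method named after the format. Before Python 3.11, `datetime.fromisoformat` raised on the common trailing "Z":

```python
from datetime import datetime

stamp = "2025-12-29T16:55:28Z"  # valid ISO 8601 (and RFC 3339)

# Before Python 3.11, fromisoformat() rejected the trailing "Z"
# despite the method's name; 3.11+ finally accepts it.
try:
    parsed = datetime.fromisoformat(stamp)
except ValueError:
    # The long-standing workaround: spell the UTC offset out explicitly.
    parsed = datetime.fromisoformat(stamp.replace("Z", "+00:00"))

print(parsed.isoformat())
```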
by lkglglgllm
12/29/2025 at 11:48:33 PM
It's an infuriatingly accurate sketch. A team should usually have responsibility for no more than one service. There are many situations where this is not possible or desirable (don't force your Kafka Connect service into your business-logic service), but it's the ideal, IMO. More services mean more overhead. But someone read a blog post somewhere and suddenly we have four microservices per dev. Fun times.
by tormeh
12/29/2025 at 8:14:16 PM
This is the kind of situation you get into when you let programmers design the business information systems, rather than letting systems analysts design the software systems.
by bitwize
12/29/2025 at 8:27:39 PM
I don't think I've ever worked on a project that had "system analysts". You might as well say "this is what happens when you don't allow sorcerers to peer into the future". The best I've ever had are product managers who maybe have a vague idea of what the customer wants.
by QuercusMax
12/29/2025 at 9:03:14 PM
Well, that's just the problem, innit. In decades past, systems analysts performed a vital function, viewing the business and understanding its information flows as a whole and determining what information systems needed to be implemented or improved. Historically, in well-functioning information-systems departments, the programmer's job was confined to implementation only. Programming was just a translation step, going from human requirements to machine-readable code.

Beginning in about the 1980s, with the rise of PCs and later the internet, the "genius programmer" was lionized and there was a lot of money to be made through programming alone. So systems analysts were slowly done away with and programmers filled that role. These days the systems analyst as a separate profession is, as you say, nearly extinct. The programmers who replaced the analysts applied techniques and philosophies from programming to business information analysis, and that's how we got situations like those with Bingo, WNGMAN, and Galactus. Little if any business analysis was done, the program information flows do not mirror the business information flows, and chaos reigns.
In reality, 65% of the work should go into systems analysis and design, well before a single line of code is written. The actual programming takes up maybe 15% of the overall work. And with AI, you can cut that to maybe a tenth: using Milt Bryce's PRIDE methodology for systems analysis and development yields specs precise enough to serve as context that an LLM can use to generate correct code with few errors or hallucinations.
by bitwize
12/30/2025 at 12:14:17 AM
I worked for a somewhat large bank that used to do this "system analysis" job in its early days. I don't recall what they called this process step, but the idea was the same. Besides the internal analysts, they used to hire consultancies full of experienced ladies and gentlemen to design larger projects before coding started.

Sometimes they were hired only to deliver specifications, sometimes the entire system. The software they delivered was quite stable, but that's beside the point. There sure were software issues, but I was impressed by how those problems were usually contained in their respective originating systems, rarely breaking other software. The entire process was clear enough, and the interfaces between the fleet of Windows/Linux/mainframe programs were extremely well documented. Even the most disorganized and unprofessional third-party suppliers had an easier time writing software for us. It wasn't a joy, but it was rational; there was order. I'm not trying to romanticize the past, but, man, we sure unlearned a few things about how to build software systems.
by myth2018
12/30/2025 at 2:38:10 AM
Nobody wants to wait for those cycles to happen in the sorts of businesses that feature most prominently on HN. That flow works much better for "take an existing business, with well-defined flows, and computerize it" than for "people would probably get utility out of doing something like X, Y, Z, let's test some crap out."

Now, at later stages in those companies, yes, part of the reason for the chaos is that nobody knows or cares to reconcile the big picture, but there won't be economic pressure to fix that without a major scaling-back of growth expectations. Which is arguably happening in some sectors now, though the AI wave is making other sectors frothier than ever at the same time, in the "just try shit fast!" direction.
But while growth expectations are high, design-by-throwing-darts, like "let's write a bunch of code to make it easy to A/B test random changes that we have no theory about, to try to gain a few percent", will often dominate the "careful planning" approach.
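For flavor, that "easy A/B testing" machinery usually bottoms out in something like deterministic hashing. A toy sketch, with hypothetical names and a hypothetical 50/50 split:

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, treatment_pct: int = 50) -> str:
    """Deterministically assign a user to 'control' or 'treatment'.

    Hashing user_id together with the experiment name keeps assignments
    stable per experiment without storing any state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100  # map the hash to 0..99
    return "treatment" if bucket < treatment_pct else "control"

print(ab_bucket("user-42", "new-checkout-flow"))
```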
by majormajor
12/31/2025 at 3:57:18 AM
> Nobody wants to wait for those cycles to happen in the sorts of businesses that feature most prominently on HN.

Bryce's Law: "We don't have enough time to do things right. Translation: We have plenty of time to do things wrong." Which was definitely true for YC startups, FAANGs, and the like in the ZIRP era, not so much now.
Systems development is a science, not an art. You can repeatably produce good systems by applying a proven, tested methodology. That methodology has existed since 1971 and it's called PRIDE.
> That flow works much better for "take existing business, with well defined flows, computerize it" than "people would probably get utility out of doing something like X,Y,Z, let's test some crap out."
The flows are the system. Systems development is no more concerned with computers or software than surgery is with scalpels. They are tools used to do a job. And PRIDE is suited to developing new systems as well as upgrading existing ones. The "let's test some crap out" method is exactly what PRIDE was developed to replace! As Milt Bryce put it: "do a superficial feasibility study, do some quick and dirty systems design, spend a lot of time in programming, install prematurely so you can irritate the users sooner, and then keep working on it till you get something accomplished." (https://www.youtube.com/watch?app=desktop&v=SoidPevZ7zs&t=47...) He also proved that PRIDE is more cost-effective!
The thing is, all Milt Bryce really did was apply some common sense and proven principles from the manufacturing world to systems development. The world settled upon mass production using interchangeable parts for a reason: it produces higher-quality goods cheaper. You would not fly in a plane with jet engines built in an ad-hoc fashion the way today's software is built. "We've got a wind tunnel, let's test some crap out and see what works, then once we have a functioning prototype, mount it on a plane that will fly hundreds of passengers." Why would a company trust an information system built in this way? It makes no sense. Jet engines are specced, designed, and built according to a rigorous repeatable procedure and so should our systems be. (https://www.modernanalyst.com/Resources/Articles/tabid/115/I...)
> Which is arguably happening in some sectors now, though the AI wave is making other sectors even more frothy than ever at the same time in the "just try shit fast!" direction.
I think the AI wave will make PRIDE more relevant, not less. Programmers who do not upskill into more of a systems analyst direction will find themselves out of a job. Remember, if you're building your systems correctly, programming is a mere translation step. It transforms human-readable specifications and requirements into instructions that can be executed by the computer. With LLMs, business managers and analysts will soon be able to express the inputs and outputs of a system or subsystem directly, in business language, and automatically get executable code! Who will need programmers then? Perhaps a very few, brilliant programmers will be necessary to develop new code that's outside the LLMs' purview, but most business systems can be assembled using common, standard tools and techniques.
Bryce's Law: "There are very few true artists in computer programming, most are just house painters."
The problem is, and always has been, that all of systems development has been gatekept by programmers for the past few decades. AI may be the thing that finally clears that logjam.
by bitwize
12/31/2025 at 5:43:12 PM
In the construction world, it's basically the separation between architects and builders.

Sure, you can definitely build things and figure things out along the way. But for any sufficiently complex project, that approach is unlikely to yield good results.
by seec
12/30/2025 at 6:59:23 AM
IMO programs are 90% data or information, and modern software vastly underutilizes that fact.

If you know what data you need, who needs it, and where it needs to go, you have most of your system designed. If you just raw-dog it, stuff ends up all over the place and you need hacks on hacks on hacks to perform business functions, and then you have spaghetti code. And no, I don't think domain modeling solves this. It often doesn't acknowledge the real needs of the system, but rather views the data in an obtuse way.
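As a tiny illustration of designing from the data outward (names are hypothetical, riffing on the sketch's session tokens): pin down what the data is, who produces it, and who consumes it, and the code becomes plumbing between well-defined shapes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Step 1: decide what the data *is* before writing any behavior.
@dataclass(frozen=True)
class SessionToken:
    user_id: str
    issued_at: datetime
    expires_at: datetime

# Step 2: decide who produces it and who consumes it.
def issue_token(user_id: str, ttl_minutes: int = 30) -> SessionToken:
    now = datetime.now(timezone.utc)
    return SessionToken(user_id, now, now + timedelta(minutes=ttl_minutes))

def is_valid(token: SessionToken) -> bool:
    return datetime.now(timezone.utc) < token.expires_at

print(is_valid(issue_token("user-42")))
```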
by array_key_first
12/30/2025 at 5:35:41 PM
This!

Per Fred Brooks: "Show me your flowcharts, but keep your tables hidden, and I shall continue to be mystified. Show me your tables, and I won't need to see your flowcharts; they'll be obvious."
It's telling that PRIDE incorporates the concept of Information Resource Management, or meticulous tracking and documentation of every piece of data used in a system, what it means, and how it relates to other data. The concept of a "data dictionary" comes from PRIDE.
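A data dictionary in miniature, just to ground the term (the fields and wording are illustrative, not taken from PRIDE):

```python
# A toy data dictionary: every field a system uses, what it means,
# and how it relates to other data. Real IRM tooling is far richer.
DATA_DICTIONARY = {
    "user_id":    {"type": "str", "meaning": "stable account identifier",
                   "related": ["session.user_id"]},
    "issued_at":  {"type": "datetime (UTC)", "meaning": "token creation time",
                   "related": ["expires_at"]},
    "expires_at": {"type": "datetime (UTC)", "meaning": "token expiry time",
                   "related": ["issued_at"]},
}

for field, entry in DATA_DICTIONARY.items():
    print(f"{field:11} {entry['type']:15} {entry['meaning']}")
```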
by bitwize
12/30/2025 at 2:36:15 AM
No, this is the situation you get into when you have programmers build a system, the requirements of that system change 15 times over the course of 15 years, and you never give those programmers time to go back and redesign, so they keep having to stack new hacks and kludges on top of the old hacks and kludges.

Anyone who has worked at a large company has encountered a Galactus that was simply never redesigned into a simple, unified service because doing so would sideline other work considered higher priority.
by wavemode