alt.hn

2/26/2026 at 9:51:48 AM

Technical Excellence Is Not Enough

https://raccoon.land/posts/technical-excellence-is-not-enough/

by bo0tzz

2/26/2026 at 1:25:57 PM

I find OP's communication style abrasive and off-putting, which tracks with them saying they've been coached on this, and found that advice lacking.

Maybe it's still insufficient advice, but it hasn't worked for them at least in part because they haven't figured out how to apply it.

From the post, I see low empathy and an air of superiority (perhaps earned by genuinely being smarter than their peers, though that doesn't make it any more attractive).

That's going to cause friction because a team is a _social_ construct.

by james_marks

2/26/2026 at 1:58:00 PM

That's because it was generated by an LLM.

by mmsc

2/26/2026 at 2:01:35 PM

I simply cannot believe people in this post are discussing this as anything other than a complete bot job. Pure clanker vomit.

by Rapzid

2/26/2026 at 2:15:38 PM

I realize it's been "written" by an LLM, but the content could have been written by someone I know. It's eerie how this person thinks exactly the same way. It's never their fault, always the others', and they are always obviously right and no amount of arguing can change their mind.

by sjamaan

2/26/2026 at 2:44:37 PM

"Write an essay about struggling to change a software org that doesn't want to change. Make me the hero. Post it at 1am so it looks like I was up late suffering with the burden of what I know."

This is unfortunately the world we are in now.

by Rapzid

2/26/2026 at 2:41:03 PM

This is not a politically correct thing to say but there is a class of neurodiverse software developers who display these characteristics and I suspect the author belongs to this group.

Frankly, reminds me of Michael O'Church

by JCDenton2052

2/26/2026 at 1:55:44 PM

Yeah, a lot of the examples made me think "wait, there's something else going on there, right?", which would make sense if the author has difficulty communicating or negotiating their proposals.

In the first example, for example, they suggested a new metric to track added warnings in the build, and then there was a disagreement in the team, and then as a footnote someone went and fixed the warnings anyway? That sounds like the author might be missing something from their story.

by MrJohz

2/26/2026 at 2:16:37 PM

> In the first example, for example, they suggested a new metric to track added warnings in the build, and then there was a disagreement in the team, and then as a footnote someone went and fixed the warnings anyway? That sounds like the author might be missing something from their story.

I do not find anything missing here. This is how things often play out in reality, both in your retelling and in what was actually written in the article.

Your retelling: some people agree and some disagree with the new metric. That is completely normal. Then someone who agrees, or wants to keep the peace, or just temporarily doesn't feel like doing "real Jira" tasks, fixes the warnings. The team moves on.

The actual article: the warnings get fixed when it becomes apparent that one of them caused a production issue. That is when the "this new process step matters" side wins.

by watwut

2/26/2026 at 6:27:25 PM

I'm referencing the footnote where the author says that the discussion caused one team member to go and fix the issue. The warnings causing a production issue is, I think, a complete hypothetical.

What this story is missing is an explanation for why people were disagreeing. Like, why is someone not looking at warnings? Is it that the warnings are less important than the author understands? Is it that the warnings come from something that the team have little control over? And the solution the author suggests - would it really have changed anything if they already weren't looking at warnings? The author writes as if their proposal would have fixed things, but that's not really clear to me, because it's basically just a view into whether the problem is getting worse, which can be ignored just as easily as the problem itself.
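For concreteness, the kind of metric being proposed is usually just a log scrape plus a baseline comparison. A minimal sketch (the function names, warning format, and baseline mechanism here are assumptions for illustration, not the author's actual setup):

```python
import re

def warning_count(build_log: str) -> int:
    # Count compiler-style diagnostics, e.g. "src/foo.c:12:3: warning: unused variable"
    return len(re.findall(r"\bwarning:", build_log))

def gate(build_log: str, baseline: int) -> bool:
    # The "one number": report the current count and flag only growth.
    # Nothing forces anyone to act on it -- which is exactly the objection above.
    count = warning_count(build_log)
    print(f"warnings: {count} (baseline: {baseline})")
    return count <= baseline
```

Even wired into CI, a team that already ignores warnings can simply raise the baseline, so the number is a view into whether the problem is growing, not a fix.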

by MrJohz

2/26/2026 at 7:44:36 PM

Someone hacked his site or something, so I can't get back to it. But I thought you meant the situation in one of the first paragraphs, where the team started taking an issue seriously only after an actual problem.

And honestly, I have seen people disagree with and fight literal standard changes like "let's have a pipeline that runs tests before merge" or "database changes must go through the test environment before being sent over".

It is perfectly possible and normal for people to fight change and be wrong, without there being some grave, smart missing reason. I have no problem trusting the author that he was simply right in hindsight.

If you have ever tried to improve processes or a project with persistent issues, the problems the author describes are entirely believable. The author does not know what to do in that situation, but he described the usual dynamic pretty accurately.

by watwut

2/26/2026 at 1:33:10 PM

The first two sentences

> Organizations don't optimize for correctness. They optimize for comfort

...do I need to say it?

by armchairhacker

2/26/2026 at 1:47:19 PM

> One number, never measured before. It doesn't change rules or add warnings, just makes the existing count visible.

Stopped here. That pattern.

I recognize this pattern from this AI "companion" my mate showed me over Christmas. It told a bunch of crazy stories using this "seize the day" vibe.

It had an animated, anthropomorphized animal avatar. And that animal was an f'ing RACCOON.

by Rapzid

2/26/2026 at 2:26:20 PM

LLMs originally learned these patterns from LinkedIn and the “$1000 for my newsletter” SEO pillions. Both accomplish a goal. Now that's become a loop.

There is a delayed but direct association between RLHF results we see in LLM responses and volume of LinkedIn-spiration generated by humans disrupting ${trend.hireable} from coffee shops and couches.

// from my couch above a coffee shop, disrupting cowork on HN. no avatars. no stories. just skills.md

by Terretta

2/26/2026 at 1:52:37 PM

You are absolutely right!

- It is not X. It is Y.

- X [negate action] Y. X [action] Z.

The titles are giveaways too: Comfort Over Correctness, Consensus As Veto, The Nuance, Responsibility Without Authority, What Changes It. Has that bot taste.

If you want I can compile a list of cases where this doesn't happen. Do you want me to do that?

by bossyTeacher

2/26/2026 at 2:07:07 PM

As someone who thinks very much like TFA, I often write like that. I swear I'm not a bot.

by sgarland

2/26/2026 at 2:08:42 PM

Maybe fix your writing then. This is not good writing.

by grey-area

2/26/2026 at 2:31:28 PM

Neither is Vonnegut's (which your short, choppy sentences reminded me of), but he was a very successful and beloved author. I'm in no way comparing myself to Vonnegut, but my point is just because it doesn't appeal to you, it doesn't mean it isn't good.

Writing is art. Does it get the intended point across? Does it resonate with the reader? Does it make them feel something? Then it is good.

by sgarland

2/26/2026 at 3:31:05 PM

I disagree on Vonnegut. Most human authors at least have a voice; even if you don't like it, it's recognisable and theirs, and I would rarely think to criticise that, because it makes the writing come alive. If you truly wrote like an LLM (there is little evidence here of that), it would not be the same.

LLMs serve up a sort of bland pap with sugary highs of excitement which resembles a cross between manic advertising copy and a breathless teenager who's just discovered whatever subject they're talking about. They also sometimes confabulate and generate text which is at best tangential and at worst completely misleading.

It's exhausting and if you haven't carefully read what they generate (which most people clearly have not), you should not expect another human to read it.

Just as an interesting taste, here is my copy above rewritten to sound even more EXCITING and ENGAGING.

"They deliver a horrifying concoction – a sickly sweet, manufactured echo of thought, a grotesque blend of relentless advertising whispers and the manic, unearned enthusiasm of a teenager just discovering a world they don't understand! But the truly chilling thing is this: they fabricate. They weave elaborate lies, constructing text that’s not just tangential, but actively, dangerously misleading!

It’s a psychic assault, a draining vortex of intellectual despair! And if you haven’t wrestled with every single word, dissected it, exposed its flaws – and frankly, I suspect most haven’t – then don’t dare expect anyone else to salvage this wreckage! This is not a passive observation; it’s a desperate plea against a future where genuine thought is suffocated by the cold, sterile logic of a machine! We must guard against this, or we risk losing everything!” -- gemma3:4b

by grey-area

2/26/2026 at 5:22:25 PM

I don't disagree with your take on how LLM copy is awful; I just disagree that this was written by an LLM. For example, this paragraph at the end:

> If you're in this position (relied upon, validated, powerless), you're not imagining it. And it's not a communication problem. "Just communicate better" is the advice equivalent of "have you tried not being depressed?"

I've seen "you're not imagining it" countless times from LLMs, but always as the leading sentence in the paragraph; for something like the above, they tend to use em-dashes, not parentheses.

FWIW, Grammarly's AI Detector thinks that 17% of it resembles LLM output, and ZeroGPT thinks that 4.5% of it resembles LLM output.

by sgarland

2/26/2026 at 9:30:41 PM

Your comments don't read like LLM-slop to me.

An occasional "it's not X, it's Y", rule of three, or em-dash isn't atypical nor intrinsically bad writing. LLM-slop stands out because of the frequency of those and other subliminal cues. And LLM-slop is bad writing, at least to me, because:

- It's not unique (like how generic art is bad compared to distinct artstyles)

- It's faux-authentic ("how do you do, fellow kids?")

- It's extremely shallow in information. Phrases like "here's the kicker" and "let that sink in" are wasted words

- The meaning is "fuzzy". It's hard to describe, but connotations and figurative language are "off" (inconsistent to the larger idea? Like they were picked randomly from a subset of acceptable candidates...); so I can't get information from them, and it's hard to form in my mind what the LLM is trying to convey (perhaps because the words didn't come from a human mind)

- It doesn't always have good organization: some parts seem to go on and on, high-level ideas drift, and occasionally previous points are contradicted. But I suspect a plan+write process would significantly reduce these issues

by armchairhacker

2/26/2026 at 2:23:52 PM

It used to be. That's why LLMs adopted it. How do you think they got their preferences? A Magic 8 Ball?

by quotemstr

2/26/2026 at 3:18:23 PM

It was okay writing in the context of marketing. A normal person never wrote like that.

by dw_arthur

2/26/2026 at 2:59:19 PM

Why is it bad writing?

by watwut

2/26/2026 at 2:15:04 PM

> I find OP's communication style abrasive and off-putting

Your comment is hilarious on a meta-level: it's an example of exactly the sort of socially-mediated gatekeeping the author of the article (machine or human, I don't care) criticizes. It is, in fact, essential to match authority and responsibility to achieve excellence in any endeavor, and it's a truth universally acknowledged that vague consensus requirements are tools socially adept cowards use to undermine excellence.

Competent dictatorship is effective. Look at how much progress Python made under GVR. People who rail against hierarchy and authority, even when deployed correctly, are exactly the sort of people who should be nowhere near anything that requires progress.

Imagine running a military campaign by seeking consensus among the soldiers.

by quotemstr

2/26/2026 at 3:31:05 PM

Consensus works in a Democracy because the best thing the government can do to help people is usually nothing.

by mapontosevenths

2/26/2026 at 2:42:15 PM

> Look at how much progress Python made under GVR.

Or, you know, Linus Fucking Torvalds. If you were carrying the success or failure of most of the world's digital infrastructure on your shoulders, you also might be grating to some.

by sgarland

2/26/2026 at 1:53:40 PM

OP's dismissiveness of soft skills is a big red flag. Unless you're a solo dev, software development is a social activity, and understanding the social dynamics is key to effecting change.

Your efforts to improve quality could be vetoed by your coworkers for a variety of reasons: they don't care, they don't trust your judgement, they see other things as a higher priority... the list goes on and on. Some of these things can't be changed by you, but some can, and that's where the soft skills come into play.

by pverheggen

2/26/2026 at 1:59:07 PM

Side note: this is why I'm not that worried even if AI becomes much better at writing code. The only times I've spent "too long" on features are times where I basically had an empty ticket. I need to find the right people to talk with, figure out requirements, iterate on changing requirements, etc.

That's only marginally sped up even if you could generate the code with a click of a button.

This was somehow related to the "social activity" part :D

by tripledry

2/26/2026 at 2:10:22 PM

100%. The thing I'm currently working on has been a pain probably 80% because the work was underspecified and didn't take a bunch of legacy concerns into account, and probably 20% because of the nature of the code itself.

If it was better specified I'd be done already, but instead I've had to go back and forth with multiple people multiple times about what they actually wanted, and what legacy stuff is worth fixing and not, and how to coordinate some dependent changes.

Most of this work has been the oft-derided "soft skills" that I keep hearing software engineers don't need.

by asa400

2/26/2026 at 2:06:20 PM

Where did they dismiss soft skills? The point is that every improvement is met with "just get better soft skills bro" dismissal, which in reality has nothing to do with soft skills. I've experienced this firsthand.

by wiseowise

2/26/2026 at 2:55:28 PM

Their direct complaint is the "just get better soft skills bro" advice, but it's dismissed indirectly:

> The "soft skills" framing is wild. You're supposed to learn to communicate your way out of a structural problem. Like taking a public speaking class to fix a broken org chart.

If learning to communicate well wouldn't fix a structural problem, then communicating well wouldn't fix it either.

by pverheggen

2/26/2026 at 3:13:13 PM

That’s the point, though? That you’re being gaslit with “you’re not communicating well” when no amount of good communication would fix the underlying issue or stop you from being dismissed?

by wiseowise

2/26/2026 at 2:23:24 PM

They do not dismiss soft skills. But they do not know how to play the politics, and they were given bad advice. I would even say that their observations are entirely correct; they accurately describe how teams function. What they do not know is how to influence people.

Bad advice given to them:

> The standard advice is always "communicate better, get buy-in, frame it differently." [...] The advice for this position is always the same: communicate better. Get buy-in. Frame it as their idea. Pick your battles. Show, don't tell.

That sort of naive kindergarten advice is how people want things to work, but it is rarely how they actually work. Literally the only functional part of it is "pick your battles". That one is necessary, but not sufficient. The listed advice will make you be seen as a nice, cooperative person. It is not how you achieve change.

So OP comes to the "the problem isn't communication. It's structural." conclusion.

by watwut

2/26/2026 at 2:26:42 PM

The point is that if you unite authority and responsibility in the same individual, you can move fast and confidently because you don't drain people's time and energy by making them "influence people". In a healthy organization, responsible people act and are held to account by their results. Democracy is a choice, not an obligation.

You're right that organizations do often become consensus-driven. It's a failure mode, not something to which we should aspire. And we certainly shouldn't tell people to deal with a shortfall of authority in an organization by becoming social slime balls that get their way through manipulating emotions and not atoms. People who advise doing this ruin good technologists by turning them into middling politicians.

"Disagree and commit" is a good thing. Escalating disagreement to a "single threaded owner" for a quick decision is a good thing. It avoids endless argumentation and aligns incentives the right way. Committees (formal or not) diffuse responsibility. Maturity is understanding that hierarchy is normal and desirable.

by quotemstr

2/26/2026 at 2:37:36 PM

Nailed it. I often cite Admiral Rickover (USN), who among many other things, said this:

"Responsibility is a unique concept... You may share it with others, but your portion is not diminished. You may delegate it, but it is still with you... If responsibility is rightfully yours, no evasion, or ignorance or passing the blame can shift the burden to someone else. Unless you can point your finger at the man who is responsible when something goes wrong, then you have never had anyone really responsible."

He was seen as an asshole who was difficult to work with, and while that's true, it's also true that without his leadership it's doubtful the Navy would have launched its nuclear power program, nor would it have been as successful. He ran a dictatorship, with an absurd number of direct reports and very little middle management, and it was wildly successful.

> "Disagree and commit" is a good thing.

> In a healthy organization, responsible people act and are held to account by their results.

The first is only true if the second is also true, but I'm sure you know that.

by sgarland

2/26/2026 at 2:52:37 PM

If you unify authority and responsibility in a single person, that one person can both move fast and move super slow. They can prevent any change at all, or cause an unreasonable amount of change.

Literally everyone else who wants to change something, or keep it the same, has to play politics as they try to influence this one person. But also, practically speaking, this one person still has to do a lot of politics. Even single team leads with great power struggle to enact change. They encounter both open and hidden opposition, and their opposition is frequently even right. They also encounter misunderstanding, passive aggression, seeming compliance, and passivity. Or simply people fully agreeing, being on board, and still doing things the old way out of inertia or organizational pressures.

> You're right that organizations do often become consensus-driven.

I did not say that at all. I did not even say that it is bad when it happens. I said that the usual advice OP was given makes people feel good, but it is bad advice for achieving change.

by watwut

2/26/2026 at 2:02:49 PM

There is no OP. It's AI slop.

by avazhi

2/26/2026 at 12:55:24 PM

I worked for 7 years in a place where trust in my technical insight slowly turned into questioning of my decisions and expertise (this was after 3 years in a tech lead role and 2 years in a staff engineer role). Sometimes the solution is just to walk away.

by eithed

2/26/2026 at 1:06:58 PM

Yeah that's a painful process, as I know from experience. What do you think is the reason for the gradual shift?

by loevborg

2/26/2026 at 1:56:25 PM

I think when you are new with good ideas, you are judged against average. If you are above average, you are listened to.

As years pass, you are judged against the standard you set, and if you do not keep raising this standard, you start being seen as average, even if you are performing the same as when you joined.

I've seen this play out many, many times.

When an incompetent person is hired, even if the issues are acknowledged, if they somehow stay, expectations of them will be set to their level. The feedback will stop, because if you complain about the same issues or the same person's work every time, people will start seeing it as a you problem. Everyone quietly avoids this, so the person stays.

When a competent person is hired, it plays out the same. After 3/5/10 years, you are getting the same recognition and rewards as the incompetent person as long as you both maintain your competency.

However, I've seen (very few) people who consistently raised their own standards and improved their impact and they've climbed quickly.

I've seen people lowering their own standards and they were quickly flagged as under-performers, even if their reduced impact was still above average.

by elevatortrim

2/26/2026 at 6:11:50 PM

I agree with this summary to a degree. An additional problem arises when you simply cannot raise the standard because you lack the political influence to do so. As the article says, sometimes companies are comfortable with the status quo, regardless of the problems, whether they are technical or not. Another issue arises when product, rather than looking at tech as a partner in pursuit of a common goal, starts to see it as an underling.

by eithed

2/26/2026 at 1:15:49 PM

While I can't say that I've observed that kind of radical shift myself, one place where I can see something similar is AI-assisted development.

Basically manager asks me something and asks AI something.

I'm not always using so-called "common wisdom". I might decide to use a library or framework that AI won't suggest. I might use technology that AI considers too old.

For example, I suggested writing a small Windows helper program in C, because it needs access to WinAPI, I know C very well, and we need to support old Windows versions back to Vista at least, preferably back to Windows XP. However, the AI suggests using Rust, because Rust is, well, today's hotness. It doesn't really care that I know very little Rust, and it doesn't really care that I would need to jump through certain hoops to build Rust for old Windows (if that's even possible).

So in the end I suggest to use something that I can build and I have confidence in. AI suggests something that most internet texts written by passionate developers talk about.

But the manager probably has doubts about me, because I'm not a world-level, trillion-dollar-worth celebrity; I'm just some grumpy old developer, so he might question my expertise using AI.

Maybe he's even right, who knows.

by vbezhenar

2/26/2026 at 1:53:15 PM

It seems like a quite clear-cut case?

You mention the tradeoffs of Rust, including the high level of uncertainty and the increased lead time since you would need to learn the language.

The manager, now having that information, can insist on using Rust, and you get a great opportunity to learn Rust, while being totally off the hook even if the project fails, since you mentioned the risks.

by tossandthrow

2/26/2026 at 2:08:11 PM

“Truly I tell you,” he continued, “no prophet is accepted in his hometown."

- Luke 4:24

It's why people often trust consultants over the people inside the organization. It's why people often want to elect new leaders even if the current leaders are doing a decent job.

The baby almost always gets thrown out with the bath water.

https://en.wikipedia.org/wiki/Don't_throw_the_baby_out_with_...

by trentnix

2/26/2026 at 6:17:49 PM

I find this hilarious given that I've experienced it from both viewpoints. 1. A consultant implemented their half-baked solution that continued to bite us for the rest of my tenure and, IMO, was completely unmaintainable. How were they able to convince leadership of their ideas? Sometimes it's just snake oil. 2. At a new place, I am preaching certain things to people who do listen and seem to want to do them. It makes me a bit uncomfortable, and it's a bit scary how easily you can find acolytes. They do validate my suggestions, ask questions and, most importantly, think, so I am hopeful that I won't turn out to be a false prophet.

by eithed

2/26/2026 at 7:43:44 PM

I've also played both roles myself at times. I've been the wise consultant, and I've been the Cassandra that nobody would listen to. My wisdom was never as good as presumed when I was the consultant, and my wisdom was far better than assumed when I was the Cassandra.

by trentnix

2/26/2026 at 6:37:05 PM

The prevalent pattern I see is things being made mundane. The capabilities you are enabling are no longer something only you could do; was your expertise ever there at all? Things running smoothly is taken for granted. Doing your job well becomes unexceptional.

by eithed

2/26/2026 at 1:18:02 PM

>Ignoring it costs more later, but later is someone else's problem

Given the standard advice to job hop every 1-3 years, and the intern/co-op work pattern of semester-long stints, is this not just a structural consequence?

Do you gain competitive advantage as a company with longer tenures? Or shorter, even?

Or is it an attitude problem? Compare with old people planting shade trees:

“Codebases flourish when senior devs write easily maintainable modules in whose extensions they will never work”

by mikrl

2/26/2026 at 1:11:39 PM

>If you're in this position (relied upon, validated, powerless), you're not imagining it. And it's not a communication problem. "Just communicate better" is the advice equivalent of "have you tried not being depressed?"

How about "have you tried unionizing?" Because the common theme here is lack of respect, which is ultimately limited by your bargaining power. That means it's your individual value against the collective will of the company, and the individual is going to lose that fight more often than not (with very rare exceptions for extremely talented and smart people who won the life lottery and are smarter than everyone at the company).

by hnthrow0287345

2/26/2026 at 1:27:09 PM

Hard to unionize a digital industry; it's very easy to find someone willing to take lower pay and "scab" when location is not truly a factor. Not to say it's impossible, but software development is one of the few trades that, just by the nature of being digital, is pretty hard to unionize.

by lukewarmdaisies

2/26/2026 at 1:29:44 PM

If we believe that, then the real question/answer for the OP's worries is that software development is in a race to the bottom, and then the advice becomes "have you tried switching to hard-to-automate-and-outsource industry?" because you are certainly never going to get respect by volunteering to be paid less just to remain competitive with cheaper workers

by hnthrow0287345

2/26/2026 at 1:48:25 PM

My advice broadly would be to find some way to be difficult to replace, and that’s one way to do it.

by lukewarmdaisies

2/26/2026 at 1:19:00 PM

I hate seeing the idea that unionization is the answer. I grew up in South GA. Every single time that a corporation didn’t want to deal with a union, they just picked up and left.

by raw_anon_1111

2/26/2026 at 1:39:01 PM

And if everywhere they went would do the same thing, then they wouldn't be able to leave. Too bad people have been convinced that unions are bad.

by bdavisx

2/26/2026 at 1:55:55 PM

They left for Mexico…

How hard do you think it is to get cheaper developers from LatAM?

by raw_anon_1111

2/26/2026 at 2:11:14 PM

All the corporations that left Georgia over unions left for Mexico?

by Rapzid

2/26/2026 at 2:35:25 PM

Not all of them. M&M Mars left for Mexico; Firestone left for Alabama.

by raw_anon_1111

2/26/2026 at 12:47:18 PM

Ouch, I don't want to work there! It seems extreme. A decent place to work lets you do your thing. There will be guardrails, but at my current job my boss has never told me not to do something. Getting the time to do it is another story, and there are solutions. Sometimes it's picking your battle and letting it go; sometimes it's driving a decision and agreement. But if you do that, people like it. And I work somewhere pretty widely mocked on HN and Reddit etc., but they are good.

At other places I worked, it was usually another engineer throwing a spanner in the works. Smaller companies have a lot of pets in the code and architecture, but if you avoid the pets you can change things.

by medi8r

2/26/2026 at 2:18:52 PM

> my current job my boss has never told me not to do something. Getting the time to do it is another story

I’m confused. The polite way to say no at work is to make it about not having time.

by dogleash

2/26/2026 at 9:20:40 PM

Yeah the other story I refer to isn't using time as an excuse to block architectural improvements though. We get time for both new features and tech debt.

But if your idea blows out the quarter, it had better be game-changing!

by medi8r

2/26/2026 at 2:07:46 PM

> Nobody disagrees with the technical argument

That's a very strong foundational claim right at the start. And in my experience, a completely false one. Which makes the whole argument that follows it completely unsound.

Also, the author seems to treat the terms "consensus" and "buy-in" as synonymous. They're not, and this distinction can make a huge difference in how healthy teams operate. Patrick Lencioni covers this well in his classic book, "The Five Dysfunctions of a Team".

by tpoacher

2/26/2026 at 2:43:12 PM

> Also, the author seems to treat the terms "consensus" and "buy-in" as synonymous.

Can you explain more? I'm not familiar with that distinction, nor that book.

EDIT: I asked ChatGPT, and it came up with this [0]. Please let me know if it's accurate (I don't necessarily dislike LLMs, I just think they're wildly oversold, and also value human input).

0: https://chatgpt.com/share/69a05ce2-95e4-8006-ae56-bd51472894...

by sgarland

2/26/2026 at 1:36:58 PM

> Authority matching responsibility. That's the only fix I've seen work.

So, if I understood correctly, he's complaining that his architectural advice to other teams/people was constantly ignored, and his proposed solution is the same thing he was complaining about?

i.e., the teams he was advising also thought authority should match responsibility, so they did what they wanted and ignored him?

by DrScientist

2/26/2026 at 1:08:41 PM

Technical excellence is often overlooked by the MBA groups. They will simply walk into a project, pick something perfectly functional and ask you to tear it down for no fucking reason other than to demonstrate "they add value" to the company. They will be really good with the slides and graphs and that's what is visible to management anyway.

Not the framework you developed. Not the fact that your work powers millions of users. To them, you're just a replaceable worker bee, only needed when something breaks. Architectural decisions are made based on their anecdotal experiences, and it's just stone, paper, scissors all over again.

And when shit blows up right in their faces, it will not be about their judgement or lack thereof; it will be about how you didn't communicate the issue properly. It will always be you under the bus. And then a bunch of these clowns go and vibe-code some stupid-ass product and sell it to gullible investors: "wHo NeEds EnGiNeErs?"

And then you read about how 1000s of users' information went public all over the internet post their launch...the very next day.

/endrant

by neya

2/26/2026 at 12:41:28 PM

> "Discuss before shipping" sounds reasonable. In practice, when you're discussing with people who resist the category of change you're proposing, the outcome is predetermined. The discussion isn't evaluation, it's a veto dressed as process.

I literally had this discussion with my boss yesterday. I spent time writing up what I already knew to be true (we have systemic issues which are unsolved, because we only ever fix symptoms, not root causes), replete with 10+ incidents all pointing to the same patterns, and was told I need to get the opinions of others on my team before proceeding with the fixes I recommended. “I can do that, but I also already know the outcome.”

> Responsibility Without Authority

This. So much this. Every time I hear someone excitedly explain that their dev teams “own their full stack,” I die a little inside. Do they fix their [self-inflicted] DB problems, or do they start an incident, ask for help, and then refuse to make the necessary structural changes afterwards? Thought so.

by sgarland

2/26/2026 at 1:37:02 PM

Those short sentences make it seem like it was written by AI - so jarring to read.

by tofukant

2/26/2026 at 2:30:28 PM

I hate AI writing as much as anyone, but cringe is orthogonal to correctness.

by quotemstr

2/26/2026 at 1:36:40 PM

That is why I think technically excellent people should be in charge. They are the ones able to see the trade-offs. They can see who is actually doing great work. Think Linus, Guido, Larry Wall, or Carmack.

by MyHonestOpinon

2/26/2026 at 2:01:06 PM

Correct, but it becomes very hard to be in charge and stay technically excellent. The higher you are, the harder it is.

by elevatortrim

2/26/2026 at 2:05:49 PM

Leadership is not the same thing as management. Maybe some day the OP will get training data to add that concept to its latent vector space.

by leothecool

2/26/2026 at 1:52:41 PM

> The gap between responsibility and authority is where burnout lives.

Insert fire writing gif here.

by wiseowise

2/26/2026 at 1:08:14 PM

OP’s experience is all too common. If he keeps trying to do the right thing, he’s going to run afoul of “the no assholes rule”.

by k33n

2/26/2026 at 2:23:44 PM

You have to be able to pick your battles. Sometimes people are in the wrong teams. Sometimes they are just assholes who think they are always right. Too often the "right thing" is subjective.

by JCDenton2052

2/26/2026 at 1:52:51 PM

That's exactly what happens in some organizations. I couldn't believe it the first time I saw it, but it is what it is. And the reason is that some bosses are addicted to consensus. Infuriating, but there's really no option other than shrugging off the problems, waiting for staff changes, or looking for another job.

by narag

2/26/2026 at 1:52:48 PM

I suspect the author has little to no experience running a commercial organisation.

Business outcomes come first, and they are only rarely aligned with technical excellence. Closing a deal might involve making an unreasonable promise, and implementing it might not require more than an ugly hack, so you go with the ugly hack and make the money.

Comfort could be important but many people don't perform well when comfortable, so the organisation has to add some degree of confusion and pressure to keep them at a productive equilibrium where they don't fall into either apathy or burst into flames.

And yes, the boss decides, not because they are especially accountable or responsible, but because the power comes from ownership. In some organisations this is veiled and workers get a say most of the time, but in a pinch it'll be the higher-ups that actually have that power.

by cess11

2/26/2026 at 2:44:34 PM

Normally, when you make an unreasonable promise or have to implement an ugly hack, it is known about and explicit that that is what is happening. The problems come when you make an unreasonable promise and no one knows it.

by UK-Al05

2/26/2026 at 1:35:47 PM

> Ignoring it costs more later, but later is someone else's problem.

and then the blame can be shifted to future generations; it's their incompetence, after all.

> Correctness wins when the cost of ignoring it becomes impossible to miss: an outage, a customer complaint, data loss. Until then, comfort wins every time.

Those who tolerate comfort winning aren't engineers and shouldn't be allowed anywhere near engineering systems, especially outside the software industry.

by instig007

2/26/2026 at 1:20:39 PM

It’s a trust issue. There’s no one more of a PITA than a new team member who joins and starts questioning every little thing and demanding it be changed (the initial questioning is fine, so long as you accept “because” as a reason). OF COURSE any team that’s shipping software will have things that don’t make sense prima facie, because they’re accumulated tech debt or historical accident.

Go beyond identifying all these problems towards solving them. Choose a small problem, where you won’t have to fight and argue, just a little dust bunny you can sweep out of the way. Do it again, and again, and again. This is how you build trust. As you build trust, it becomes easier to seek change.

Additionally, you may also find that not all the little problems are worth solving, and what’s more interesting are the bigger problems around product-market fit, usability, and revenue.

by jonstewart

2/26/2026 at 1:51:01 PM

> Additionally, you may also find that not all the little problems are worth solving, and what’s more interesting are the bigger problems around product-market fit, usability, and revenue.

TFA's author (and I) have wildly different motivations from you. I don't know the author, but I have said verbatim much of what they wrote, so I feel like I can speak on this.

Beyond the fact that I recognize the company has to continue to exist for me to be employed, none of those hold the slightest bit of interest for me. What motivates me are interesting technical challenges, full stop. As an example, recently at my job we had a forced AI-only week, where everyone had to use Claude Code, zero manual coding. This was agony to me, because I could see it making mistakes that I could fix in seconds, but instead I had to try to patiently explain what I needed done, and then twiddle my thumbs while cheerful nonsense words danced around the screen. One of the things I produced from that was a series of linters to catch sub-optimal schema decisions in PRs. This was praised, but I got absolutely no joy from it, because I didn't write it. I have written linters that parse code using its AST before, and those did bring me joy, because it was an interesting technical challenge. Instead, all I did was (partially) solve a human challenge; to me, that's just frustration manifest, because in my mind if you don't know how to use a DB, you shouldn't be allowed to use the DB (in prod - you have to learn, obviously).
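An AST-based linter of the sort described really is only a few lines of Python. As a sketch (the bare-`except` rule here is an invented example, not the schema checks from the comment):

```python
import ast

def find_bare_excepts(source: str) -> list[int]:
    """Return line numbers of bare `except:` clauses in the given source."""
    tree = ast.parse(source)
    return [
        node.lineno
        for node in ast.walk(tree)
        # ExceptHandler.type is None exactly when the clause names no exception
        if isinstance(node, ast.ExceptHandler) and node.type is None
    ]

code = """
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(code))  # -> [4]
```

Hooking something like this into CI against changed files is the whole trick; the interesting part, as the comment says, is the AST walk itself.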

I am fully aware that this is largely incompatible with most workplaces, and that my expectations are unrealistic, but that doesn't change the fact that it is how I feel.

by sgarland

2/26/2026 at 2:06:14 PM

Don't really have anything to add but I do want to say you're not alone - I feel very similarly about AI tooling, the level of satisfaction I get from using them (none), the need for interesting technical challenges, etc. etc.

by miningape

2/26/2026 at 2:18:16 PM

There are dozens of us!

Re: AI, that's not to say I don't use it, I just view it as a sometimes useful tool that you have to watch very closely. I also often view their use as an X-Y problem.

Another recent example: during the same AI week, someone made an AI Skill (I'm not sure how that counts as software, but I digress) that connects to Buildkite to find failed builds, then matches the symptoms back to commit[s]. In their demo, they showed it successfully doing so for something that "took them hours to solve the day before." The issue was having deployed code before its sibling schema migration.

While I was initially baffled at how they missed the logs that very clearly said "<table_name> not found," after having Claude go do something similar for me later, I realized it's at least partially because our logs are just spamming bullshit constantly. 5000-10000 lines isn't uncommon. Maybe if you weren't mislabeling what are clearly DEBUG messages as INFO, and if you didn't have so many abstractions and libraries that the stack traces are hundreds of lines deep, you wouldn't need an LLM to find the needle in the haystack for you.
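The log-level point generalizes: with Python's stdlib `logging`, an INFO threshold silently drops DEBUG chatter, which is exactly the filtering that spammy, mislabeled logs defeat. A generic sketch (the messages and logger name are invented, not the commenter's actual stack):

```python
import logging

logging.basicConfig(format="%(levelname)s %(message)s")
log = logging.getLogger("app")
log.setLevel(logging.INFO)  # the threshold most production systems run at

# Chatty internals belong at DEBUG, where an INFO threshold filters them out.
log.debug("cache probe for key=user:42")       # suppressed
log.info("request handled in 12ms")            # shown
log.error('relation "orders" does not exist')  # shown - this is the needle
```

Label the cache probes INFO instead, and the needle drowns in 5000 lines of haystack.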

by sgarland

2/26/2026 at 2:09:20 PM

I’m a development manager and senior developer. I have seen the described behavior from TFA play out on several different teams. Sometimes such team members learn to adapt their approach while holding onto their ideals, and they become valued colleagues. Other times they don’t and they leave out of frustration or are fired or spin their wheels. I have no doubt there’s a great deal of truth in the author’s description, but there’s also maybe some truth in the feedback they’ve received.

I also share some of your philosophy — life is too short for us not to find joy at work, if we can. It’s a lot easier to find that joy when the team’s shipping valuable software, of course.

by jonstewart

2/26/2026 at 2:24:30 PM

> Sometimes such team members learn to adapt their approach while holding onto their ideals, and they become valued colleagues.

What's frustrating (I've said that a lot, I know) to me is that my skills are seen as valued, but my opinions aren't. I also have a pathological need to help people, and so when someone asks me, I can't help but patiently explain for the Nth time how a B+tree works (I include docs! I've written internal docs at varying levels!) and why their index design won't work. This is usually met with "Thanks!" because I've solved their problem, until the next problem occurs. When I then point out that they have a systemic issue, and point to the incidents proving this, they don't want to hear it, because that turns "I made an error, and have fixed it" into "I have made a deep architectural mistake," and people apparently cannot stand to be wrong.

That also baffles me - I don't think I'm arrogant or conceited; when I'm wrong, I publicly say so, and explain precisely where I was mistaken, what the correct answer is, and provide references. Being wrong isn't a moral failing, or even necessarily an indictment on your skills, but for some reason, people are deathly afraid to admit they were wrong.

by sgarland

2/26/2026 at 2:13:57 PM

So basically you get hired with 10-15 years of experience, and you start out doing nothing but earning trust by fixing small problems, for how long? That sounds like a great way to land in "does not meet expectations" territory very quickly.

by menaerus

2/26/2026 at 1:01:21 PM

Exactly

by avaku

2/26/2026 at 12:59:47 PM

I've seen this pattern play out, and been frustrated by it many times over.

> Authority matching responsibility. That's the only fix I've seen work. Either you get decision-making power that matches the decisions you're already making, or you find a place that treats your judgment as an asset instead of something to manage.

I don't think the solution is to become some kind of dictator. And I don't think it's about not valuing your judgement.

The key issue is a fundamental misalignment of core values. In the examples given, the culture is such that quality is not the highest priority. A system based on consensus only really works if core values are shared, or there will always be discontent. Consensus won't work under these circumstances. You'll never be able to 'trust' your colleagues to 'do the right thing'.

If you care about quality, you have to look for another organisation and have a lot of questions about how they assure quality.

by louwrentius

2/26/2026 at 2:11:20 PM

> The key issue is a fundamental misalignment of core values.

Agreed, but my main frustration is what glitchc wrote a few comments down: "No one actually claims their product is crap and quality doesn't matter."

I have never met anyone in management who will admit that they value velocity over correctness and uptime, but their actions do. If you want to optimize for velocity, growing your user base, expanding your features, that's fine - but you need to acknowledge that you're making a trade-off in doing so. If you're a solo dev, or working at an extremely small shop with high trust, it's possible that you can have high velocity and high quality, but the combination is vanishingly rare at most places.

by sgarland

2/26/2026 at 12:50:33 PM

The world is a big place with all kinds of organizations and people that fit in different ways on those organizations.

Some organizations do in fact optimize for correctness, and some people are good at it.

Some people are good at everything (totally possible; the universe doesn't care about keeping dichotomies). Maybe that technical guy was only technical up until now because it was what added the most value. People often don't consider that.

Right now, we're seeing some small changes in value dynamics. It makes us foster those (mostly pointless) meta-conversations about what organizations are and how people fit into them. But the truth stays the same: both are incredibly diverse.

by gaigalas

2/26/2026 at 1:31:56 PM

Okay I'll bite: How does one find these organizations? They all have high quality listed in marketing blurbs on their websites. No one actually claims their product is crap and quality doesn't matter.

by glitchc

2/26/2026 at 2:03:36 PM

IME, here are some signals that a company actually values correctness. This is not all-inclusive, nor is any one of them a guarantee.

* Their codebase is written in something relatively obscure, like Elixir or Haskell.

* They're an infrastructure [0] or monitoring provider.

* They're running their code on VMs, and have a sane instantiation and deployment process.

* They use Foreign Key Constraints in their RDBMS, and can explain and defend their chosen normalization level.

* They're running their own servers in a colo or self-owned datacenter.

And here are some anti-signals. Same disclaimers apply.

* Their backend is written in JS / TS (and to a somewhat lesser extent, Python [1]).

* They're running on K8s with a bunch of CRDs.

* They've posted blog articles about how they solved problems that the industry solved 20 years ago.

* They exclusively or nearly exclusively use NoSQL [2].

0: This is hit or miss; reference the steady decline in uptime from AWS, GitHub, et al.

1: I love Python dearly, and while it can be made excellent, it's a lot easier to make it bad.

2: Modulo places that have a clear need for something like Scylla - use the right tool for the job, but the right tool is almost never a DocumentDB.
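On the foreign-key signal: a minimal demonstration of what constraint enforcement actually buys you, using SQLite via Python's stdlib `sqlite3` (enforcement there is opt-in per connection; the table names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in per connection
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES users(id)
    )
""")
conn.execute("INSERT INTO users (id) VALUES (1)")
conn.execute("INSERT INTO orders (id, user_id) VALUES (10, 1)")  # valid parent row

try:
    # No user 99 exists - the database rejects the orphan row outright.
    conn.execute("INSERT INTO orders (id, user_id) VALUES (11, 99)")
except sqlite3.IntegrityError as e:
    print(e)
```

Teams that skip the constraint are betting that application code will never write an orphan row; teams that keep it can explain why that bet is bad.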

by sgarland

2/26/2026 at 1:59:42 PM

Look at what they do instead, not their marketing. NASA is the obvious and biggest example. They won't be vibe coding and skipping QA any time soon. Probably ever.

Look at any high quality open source software, and the care people put into them. Those are organizations, made up of people, some of them highly technical.

Startups often don't optimize for correctness. They can't afford it. But that's a niche. Funny enough, it's the one that's being most affected by the shift in value dynamics right now, so I understand that some people here might see the world as just this, but it isn't.

by gaigalas
