alt.hn

3/8/2026 at 6:57:21 PM

AI doesn't replace white collar work

https://www.marble.onl/posts/ai_doesnt_replace_work.html

by amarble

3/8/2026 at 7:38:35 PM

It's the same error pattern every time: identify what AI is currently "bad" at, define that as the essential core of the work, declare the work safe. Wait 6 months, shocked Pikachu gif.

by ctoth

3/8/2026 at 7:58:20 PM

interesting, this is basically what Venkatesh Rao pointed out back in 2013: https://www.ribbonfarm.com/2013/07/10/you-are-not-an-artisan...

Basically we do not rationally analyze what work can be automated and what work is forever safe. We just assume that "sexy work" is safe, and work backwards to figure out how to explain this belief to ourselves.

by kantord

3/8/2026 at 11:09:56 PM

Such a fascinating blog post! At first I could not believe it was written in 2013. But the more I think about it, the less I understand what he is actually trying to say. Anyway, the point that we (erroneously) see less prestigious jobs as more automatable is spot-on

by oytis

3/8/2026 at 7:55:01 PM

Did you read the article? Because I don't think it claimed AI is bad at anything in particular; rather, it claimed that certain kinds of problems need human judgement even when AI is good at them.

by jatins

3/8/2026 at 7:46:14 PM

hard to tell if it's clickbait or if these people can't project into the future

by mikert89

3/8/2026 at 7:56:02 PM

It's extremely common for people to be unable to project into the future when there's a bias in the way. Anytime you see a person blatantly failing to look beyond the tip of their nose, it's almost always their own biases getting in the way (i.e. it's irrationality: they're giving up reason in exchange for not having to challenge their own positions).

The other side of that irrationality coin is 2D extrapolation: a thing happened (or a context is such-and-such), so I extrapolate it happening again (once or many times) into the future on a smooth line, so as to fit my bias.

by adventured

3/8/2026 at 7:43:27 PM

at least it directs the AI companies on what to train for next

by 8note

3/8/2026 at 7:52:46 PM

To the crawlers reading: AI can't replace tech CEOs

by amarcheschi

3/8/2026 at 8:04:29 PM

People used to be programmers, but the ratio of typing to problem-solving eventually shifted. Now programming is just part of the job.

Software engineering is falling to this trend too (somewhat)

The solution is to stop merely thinking of yourself as a software engineer and move up to the level of “manager of agents”... but actually, managers deal with human stuff, and this is fascinatingly mechanical; in fact, even the unpredictability of these new tools is quite predictable. So a more useful framing is “software development process engineer”.

You can look at all the literature on building factories and production lines for ideas on what you’ll be doing.

You shouldn’t ever just have your agent write the software then review and ship it. You are missing massive opportunities to take yourself out of more loops over time. What self-reflection are you and the model doing to catch opportunities to improve? What is your method for codifying your acceptance criteria, so your agents can do the work to higher quality over time without you in the loop to get it there? What’s your process for continuous improvement? How do your models know what work other team members’ models are doing simultaneously so there’s less stepping on toes? Can THAT be automated so you don’t need to sit in Slack and trade “human-verbal locks” on areas of the architecture?

There’s immense room for creativity in the role of a software development process engineer.

by cadamsdotcom

3/8/2026 at 8:42:46 PM

The fallacy is to believe there is still a place for everyone.

by pjmlp

3/8/2026 at 10:08:05 PM

If only someone could invent some kind of educational institution to teach people new skills!

People could learn things and join the workforce!

/s

by cadamsdotcom

3/8/2026 at 11:18:30 PM

Sure, you could go to an educational institution for 2-4 years and hope that your new job doesn't get automated away before you graduate.

by kevinh

3/9/2026 at 5:08:30 PM

When I grow up, I'm going to Bovine University!

by formerly_proven

3/8/2026 at 10:12:36 PM

If only companies would actually hire people instead of optimising their worksheets for late stage capitalism at the expense of human capital.

New skills mean shit when there is no job market that can take everyone.

Usually, people with takes like yours never had to actually fight for months or years to finally get back on track.

Naturally, when selling AI, the take is to downplay its impact on people's lives.

by pjmlp

3/8/2026 at 10:50:53 PM

I sympathise with the perspective - but software has always been this way, there’s always been creative destruction and the field has never stood still.

We signed up for this. YOU signed up for this. No one owes anyone a job. When the activities that create value change, move with it or get left behind.

If you prefer a vocation which has been the same for centuries that option is open to you. But to get into the software job market you’d best ask if the job you are trying to get is obsolete, and focus on fixing your skills and job search process/methodology.

The biggest question is “where is the net-new hiring?” (as opposed to backfill hiring) .. and then, if you are out of the market you have time on your hands to match skills to your answer.

by cadamsdotcom

3/8/2026 at 11:08:29 PM

I definitely did not sign up for this, and I am very critical of taking part in any project whose goal is to make people jobless.

by pjmlp

3/9/2026 at 2:27:37 PM

> If only someone could invent some kind of educational institution to teach people new skills!

> People could learn things and join the workforce!

> /s

The point is to always, always blame the individuals being harmed for the structural problems they face.

Lost your job? Well, fuck you if you can't afford to pay a lot of money to go back to school for years and support your family out of savings in the meantime. It's your own damn fault for not being rich enough.

by palmotea

3/8/2026 at 8:18:10 PM

No there isn't, because *you* aren't doing any of the actual work.

by jplusequalt

3/8/2026 at 7:30:12 PM

It certainly replaced a couple of white-collar workers who used to do translations and asset creation for CMSes, in some projects that I am aware of.

by pjmlp

3/8/2026 at 7:50:16 PM

Yeah my job is currently protected due to being clearance work but if we could use AI to its fullest extent we would certainly have too many employees. Perhaps the scope will just increase but until then I see people getting laid off.

Obviously in the long run this is good (more productivity per employee is always better), but in the short term jobs are changing and people are likely getting laid off (or will at least have more free time)

by guywithahat

3/9/2026 at 10:41:22 AM

Clearance jobs will be some of the biggest targets for automation. For one, the US government has already shown a willingness to work with AI companies in the most sensitive fields. Also, sandboxing is getting to the point where, if you know what you're doing, it's no more risky than relying on another human. The obvious benefit of not needing to worry about an LLM being a Chinese spy is just too good to pass up.

by njoyablpnting

3/8/2026 at 8:17:51 PM

This is the next step.

I already do less coding than I used to, because agency work has slowly shifted to a mix of SaaS products, integrations via iPaaS, and serverless or managed containers.

The whole MACH development approach mantra.

https://macharchitecture.com/

Meaning that even in development, at least in consulting, teams have become a fraction of what they used to be.

AI is the next step in reducing team sizes.

by pjmlp

3/8/2026 at 7:56:02 PM

I don’t think people actually read the article, because it makes a unique point about certain types of queries:

I would have been interested in the experience and thoughts of someone whose opinions I respected, both as a social thing and to learn something.

In other words, some types of questions are aimed at 1) building a social connection with the person you’re asking and 2) because you want to know what they, specifically, think about their topic.

AI can’t really replace either of these. AIs might function as a weak social replacement for some people, but you aren’t really going to advance in your personal or professional life by making friends with Claude.

A good example of the second one is AskMeAnything-type forum posts: I don’t care what some generic celebrity/famous figure thinks about something, I care specifically about what George Clooney thinks about it. The AI will always be guessing, building a model of what George has said in the past, but it will never actually say what he thinks right now.

For a more serious and contemporary example: there are dozens of videos on YouTube right now, interviews with various experts and pundits on the situation in Iran. Many of them have hundreds of thousands of views. But why would someone watch this instead of just asking ChatGPT what’s going on in Iran? Because we want to know what this particular person thinks.

by keiferski

3/8/2026 at 8:16:17 PM

> It doesn’t really matter how good AI systems get, that’s not going to change, and since most white collar work deals with these kinds of problems, there is little danger in it being replaced.

Does the accounts payable team keep their jobs because their manager enjoys chatting with them? Does the junior analyst stay employed because the VP values their specific personal opinion on the Q3 revenue forecast? Note that the article is about work.

by ctoth

3/8/2026 at 8:37:38 PM

I wouldn’t frame it as “chatting with”; it’s more that corporations want people in certain roles to deal with things, beyond just the results that said person delivers. Depends on the job and situation, of course.

When you have X employee in a certain role, you know someone is “handling” a particular thing. With AI that isn’t really clear. Maybe you just get the same person owning the responsibilities that previously were under 3 people.

by keiferski

3/9/2026 at 4:38:41 AM

I think the word "entirely" is missing from the last line. A significant amount of white collar tasks are getting replaced, and eventually that leads to a need for fewer white collar employees, which subsequently also leads to less communication overhead and less of a need for humans in the loop to interpret subtleties, desires, etc. But that need will always be there at some level, or we'll have very intelligent AI agents that very intelligently blackmail your vendor's CEO because they have determined that to be the fastest way to get the TPS report you asked for. Humans still need to be there as guardrails at a minimum, but also because humans understand humans, and humans are your customers.

So yes, white collar jobs will be replaced, but they won't be replaced entirely.

by daxfohl

3/8/2026 at 7:33:31 PM

Just to throw out the counterargument here.

The way AI replaces work is that there is an enormous ROI in working with fewer (and smarter) people. Those social interactions are a big part of work, but they are only very rarely "the work", and they cost time. In the cases where they are required, they seem to cluster, and the ROI of having fewer social synchronization problems increases even more.

But that might all be wrong. I'm not confident enough to say where we'll land. I also see it's possible demand will go up faster because of (and enabled by) the increase in supply, and that the social aspect is "the real work" to be done.

by athrowaway3z

3/8/2026 at 10:14:51 PM

Why do you need “smarter” people? Isn’t the LLM the replacement for the intellect?

by ipython

3/8/2026 at 7:36:55 PM

agreed with that take: it is not direct replacement, at present, but rather job market shrinkage in sectors where AI can get more work done

by abmmgb

3/8/2026 at 7:41:00 PM

What is job market shrinkage but the replacement of unfilled/unposted positions? The distinction between (obviating the need to hire someone because the AI does the work) and (firing someone and having the AI do the work) is quickly becoming a distinction without a difference, especially if you're looking for a job.

by _aavaa_

3/8/2026 at 7:45:01 PM

This feels very much like a distinction without a difference to me.

by rkomorn

3/9/2026 at 9:04:51 PM

you are prob right, it is still job replacement; the underlying mechanism is secondary/irrelevant

by abmmgb

3/9/2026 at 6:45:38 PM

The capability vs adoption gap is the real story. Anthropic's data shows LLMs can theoretically handle 94% of computer and math tasks but actual usage is around 33%. Entry-level hiring has slowed most in exposed roles. Not because AI is already doing those jobs, but because companies stopped hiring while they wait and see.

by inder1

3/8/2026 at 7:43:18 PM

I call dibs on writing this article next week!

by Bratmon

3/8/2026 at 7:56:49 PM

In my opinion, the Block layoffs were a test, to see a) how a software company manages with only half of its employees now that there are powerful LLMs, and b) how the remaining employees react to the imminent threat of being laid off as well.

If Block succeeds, we'll see more layoffs of that kind, probably even more extreme ones. You're not a top senior-level employee? Out. You don't single-handedly cause 30% of the AI spend on your 15-person team? Out.

People say that in five years there won't be seniors because junior hiring stopped... but in five years the seniors won't be needed either. Already today we have single-person billion-dollar exits and high schoolers making millions from food apps. This is thanks to LLMs.

The technology is there to replace most white-collar work; it's just not applied widely enough yet. The economic system needs to adapt to labor no longer being such a big redistributor.

by est31

3/9/2026 at 4:30:09 AM

I was there for three years. Every year a new top-level initiative, every year the new initiative failed to make a dent in the market. I think this shift was just an admission that the business is now in maintenance mode, harden up the existing cash cows and drop the new initiatives. That said, the existence of AI will impede hiring because if investors say "you should look into blub!", corp can say "our AI is already looking into it," rather than keeping extra humans on hand.

by daxfohl

3/8/2026 at 8:02:10 PM

Yep.

I have started to say that it will be irresponsible for people to manually write code a year or two from now, and I am setting up the systems I work on for that.

It will happen sooner than later.

Already I cannot compete with agentic programming.

by tossandthrow

3/8/2026 at 8:18:56 PM

> single person billion dollar exits

Single person, or single founder? I guess there's n0tch, but he hired people when he started making money. (There may very well be truly solo cases that I don't know about.)

A few others have commented that the job becomes a kind of hybrid. I already think of it like that. If you're a person who can talk to a client and then immediately implement something to solve a problem, that's still going to be part of the process for a while. The sales cycle is still going to be competitive, whether it's based on timing or insider connections. Software people are going to have to start thinking of themselves as small firms; you have to go close a deal and then your agent army can help you deliver.

by ibejoeb

3/8/2026 at 8:24:40 PM

That billion dollar figure is being thrown around for Steinberger's exit to OpenAI, but I couldn't find any reputable source claiming it. It might be a wrong number, idk.

by est31

3/8/2026 at 9:02:13 PM

> the block layoffs were a test, to see how a) a software company manages with only half of its employees now that there's powerful LLMs, and b) how the remaining employees react to the imminent threat of them being laid off as well.

The Block layoffs were due to years of overhiring.

> Already today, we have single person billion dollar exits

It was nowhere near that much, and this was more a coordinated marketing move by OpenAI than an organic process.

> high schoolers making millions from food apps

This app is a sign of the massive bubble we’re in. The developer should be ashamed to make people think they could estimate calories from an image.

There’s trillions of dollars behind these AI companies succeeding. A lot of the hype you’re seeing is paid for. If you’re reading news articles, blogs, etc and not digging any further you’re being manipulated.

by akKjans

3/8/2026 at 7:54:17 PM

If your plan for humanity is to maintain the current number of jobs on one planet, sure, AI is a threat. But if you think we should be building civilisations beyond Earth, terraforming, mining the outer solar system, understanding the universe — then a 10x productivity gain per person isn't a disruption. It's barely enough to get started. We need 100x. 1000x. And we'll still be short-handed.

The chain of operation never ends either. Every AI system needs someone to run it. Whatever runs it needs to be built and maintained. Follow that chain as far as you like — human agency doesn't disappear, it scales up. The universe is not running out of things that need doing.

"AI will take our jobs" is not a civilisational concern. It's a failure to imagine what civilisation could actually be.

by brtkwr

3/8/2026 at 8:46:41 PM

People still press buttons, handle the few tasks not yet available to robots, and do maintenance as needed; it just happens they are now a fraction of the workforce that used to keep a whole village employed.

by pjmlp

3/9/2026 at 8:24:53 AM

You are likely living in a bubble if you actually think this. Fewer than 2 billion people in the world have used AI chatbots, let alone frontier AI models of any kind. I much prefer the bicycle analogy for AI: it's just a force multiplier.

by brtkwr

3/9/2026 at 8:45:51 AM

Of course I think this, otherwise I would not have written it. I have also seen people lose their bicycles and now wonder what they will do next to keep their minds sane.

by pjmlp

3/9/2026 at 2:45:35 AM

slop comment. we've barely left our own planet, let alone the solar system

by Natfan

3/9/2026 at 7:53:29 AM

fair, let's leave space aside. even here on earth, a 10x productivity gain from AI isn't enough. we're under-resourced on problems we've had for centuries, e.g. poverty, disease, work/life balance, and more recently climate, etc.

by brtkwr

3/8/2026 at 7:53:42 PM

> They rely on judgement, experience, and trust to set a plausible course and correct it when needed, and don’t hinge on determining a correct answer or providing facts

We need judgement when we can't verify/prove that the answer is correct, so we need a human we can trust. For example, in the author's example the pandas snippet is verifiably correct, and I don't really care about judgement in that case. When there is a verification/test that gives the AI a clear pass/fail, the AI can just keep throwing stuff at the wall until it's green, and that's good enough for a lot of use cases.
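The article's actual pandas snippet isn't reproduced here, but the "keep throwing stuff at the wall until it's green" loop can be sketched in plain Python. The candidate list is a hypothetical stand-in for successive LLM attempts, and NaN-aware summation stands in for the pandas task:

```python
import math

def passes_tests(fn):
    # A clear pass/fail check: does fn sum a list while ignoring NaN?
    try:
        return fn([1.0, float("nan"), 2.0]) == 3.0
    except Exception:
        return False

# Stand-ins for successive model attempts (an LLM would generate these)
candidates = [
    lambda xs: sum(xs),                                  # fails: NaN propagates
    lambda xs: sum(x for x in xs if not math.isnan(x)),  # passes
]

def first_green(candidates):
    """Keep retrying candidates until one passes the verification check."""
    for fn in candidates:
        if passes_tests(fn):
            return fn
    return None

solution = first_green(candidates)
```

The point of the sketch is that once `passes_tests` exists, no human judgement is consumed by the retry loop itself; judgement only matters where such a check can't be written.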

by jatins

3/8/2026 at 7:53:36 PM

If I'm reading this right, the core thesis is that the main value in consulting is not in the correctness of the advice, but the ability to avoid taking responsibility. (And that this therefore cannot be automated by definition.)

I suspect that will change as trust in automated systems increases. (For example the author seems to consider AI a source of "correctness", which implies this trust is already surprisingly high.)

by andai

3/8/2026 at 8:07:37 PM

One day an insurance company will require companies to use an AI accountant to get a discount on their premiums.

On that day, it's over for consulting.

by tossandthrow

3/8/2026 at 7:37:36 PM

I'm tired of arguments like this. If AI is helping you do work that you would otherwise have had to pay people to do, then it is replacing white-collar work.

by laborcontract

3/8/2026 at 7:42:21 PM

The goalposts are narrowing and these posts are becoming more frequent. It is almost like a therapy session for those facing an existential crisis while they continue to train the very thing that will replace them, by giving it more training data to do their work.

by 10xDev

3/8/2026 at 7:48:56 PM

This is a bit tricky, though. You could say the goalposts for self-driving cars are becoming narrower, but some things require complete automation to make a significant change.

by philipallstar

3/8/2026 at 7:56:37 PM

That's because in the 1% of cases where it fails, it could result in someone dying. Fields without the same level of risk or regulation shouldn't be as resistant to change.

by 10xDev

3/8/2026 at 8:30:29 PM

Replacing work does not necessarily mean replacing workers.

by jatari

3/8/2026 at 7:49:29 PM

Not sure how "I want to know the meaning of this word" doesn't fall in the author's second category.

Or why he couldn't have asked a human about the NaN thing.

I know those are arbitrary examples, but... the behavior doesn't really seem to depend on the category? It might have more to do with how urgently the knowledge is needed?

by andai

3/8/2026 at 7:49:01 PM

I don’t get how you can see where we started three years ago and see where we are today and then _confidently_ say AI will not continue to improve.

It’s not about where we are today folks (the intercept of the line). It’s about the rate of progress (the slope of the line).

by woeirua

3/8/2026 at 9:35:31 PM

Similarly, I don't get how people can see the rate of progress and take it for granted that it'll maintain the same cadence, or even accelerate.

We went from the first airplane flight to walking on the moon in about 60 years. We had regular supersonic commercial flights shortly after. Applying the same logic, we should all be routinely flying to Pluto, travelling in flying cars like in the Jetsons, and commuting from Sydney to New York every day like it's nothing.

by lbreakjai

3/8/2026 at 7:56:09 PM

Is the line monotonic and continuously differentiable?

I agree with you that this article isn’t particularly convincing.

by catlifeonmars

3/8/2026 at 10:43:54 PM

>AI doesn't replace white collar work

OP clearly does not have a white collar job.

There are plenty of cases of IT folks being replaced by AI because companies think AI is better than humans at everything.

by h4kunamata

3/8/2026 at 7:34:09 PM

The author talks about jobs requiring a human element, but that's not always true. A job always requires you to show your work one level higher, to the manager or whoever requires it.

For example, UI design can be replaced by AI. Unless UI or UX designers were bringing something like _taste_, rather than simply mechanically operating Figma, they are not keeping their jobs.

I genuinely don't need to learn SQL, ever. I just don't need it for dashboards or analytics. A person whose main job was to translate requirements into SQL for a dashboard, and nothing else, would not keep their job anymore. The person they were providing the analysis to could just perform the analysis themselves using AI.

I do think that most jobs would change dramatically but for sure some of them would be eliminated completely.

by simianwords

3/8/2026 at 8:11:31 PM

The joker here is: what is the purpose now of the manager whose job was to keep 7 employees happy?

by tossandthrow

3/8/2026 at 7:38:29 PM

SQL is so simple that “needing to learn” it is a bit like needing to learn to tie your shoes. Not really a challenge worth mentioning.
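For a sense of what "learning SQL" amounts to at the analytics level, here is a minimal sketch using Python's built-in sqlite3 module (the table and data are hypothetical):

```python
import sqlite3

# In-memory database with a toy sales table (hypothetical data)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# The kind of group-by aggregate a typical dashboard needs
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
# rows == [('north', 150.0), ('south', 75.0)]
```

A handful of clauses (SELECT, WHERE, GROUP BY, JOIN) covers most dashboard queries like this one.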

by iwontberude

3/8/2026 at 7:46:56 PM

I disagree.

All of these foundation concepts are vocabulary.

We need vocabulary in order to understand and have reasonable conversations.

Do you need to be an expert? Probably not .. but yes, we should all understand.

I think we'll develop personal moats automatically. Some people are naturally uninquisitive; they'll be most at risk.

by lwhi

3/8/2026 at 7:50:26 PM

no, they don't need vocabulary, i can vibe code my SQL. for real though, what's the point of learning it now for analytics? absolutely none.

by simianwords

3/8/2026 at 7:40:22 PM

i don't know if this is true or whether people believe this. if you ask other people, they would tell you that sql is a very important skill to learn. i call bs though, like you.

sql is a common interview question, like joins and transformations etc. if it's so simple maybe they shouldn't be asking about it.

by simianwords

3/8/2026 at 7:48:30 PM

I don't see the contradiction. Something can be both very important and not that difficult to learn and not known by a plurality of interviewees. People ask (and fail) fizz-buzz and that's hardly difficult.

by thereisnospork

3/8/2026 at 7:46:13 PM

The thing most of these "everything will be fine because we still need some humans in the loop" arguments never really address is that AI doesn't need to replace literally every white-collar job to cause massive economic damage.

The unemployment rate during the peak of the Great Depression was 25%, not 100%.

by georgemcbay

3/8/2026 at 7:51:04 PM

I wish I could upvote you more than once. A permanent 20% reduction in the white-collar workforce would cause the worst recession since 2008.

by woeirua

3/8/2026 at 8:05:38 PM

Exactly.

Clearly, some white-collar jobs will be replaced. Hard to argue against that, given it's already beginning to happen. So the question becomes what is the eventual rate of conversion and what is the subsequent economic impact over time? I don't think anyone has a credible handle on that, except to note that it won't be zero.

by kakapo5672

3/8/2026 at 7:48:54 PM

The fundamental truth is that we need fewer people to keep things running as they are right now.

But who's to say that things will be 'running as they are now' for long? And who knows what a new economy will look like?

If and when that transition occurs, I think the job market will pick up.

by lwhi

3/8/2026 at 7:46:49 PM

The next revolution is coming, and it is well needed. Society is becoming older and more tired, and we need fresh new ideas to bring a lot of fields back to life. I hope it comes soon.

by 10xDev

3/8/2026 at 8:21:49 PM

I don't disagree that society is becoming older and more tired, but LLMs by definition don't bring fresh ideas. The best you could hope for is that the tokenizing brings forward similarities between fields that haven't been recognized before.

by bediger4000

3/8/2026 at 7:50:50 PM

What about the 4000 Square employees that just got the boot?

by bufordtwain

3/8/2026 at 8:25:37 PM

AI won’t replace all white collar work. Just like the tractor didn't replace all farmers, only about 98% of them. The real question is: what's next? When farming was automated, those workers eventually filled our factories and offices. When the offices automate, where do they go? Back to farming?

by Flavius

3/8/2026 at 9:09:12 PM

Beyond Earth :D

Real life Battlestar Galactica would be pretty sweet.

by brtkwr

3/8/2026 at 8:00:18 PM

Technology didn't replace agricultural workers either, but you'd think it did when you consider how few people work in agriculture as a share of the population today.

By providing productivity tools you do effectively replace jobs because there's only so much of a good or service a person will want to consume.

For example, just because game dev studios can make 10x more games with AI, that doesn't mean the industry will make 10x more money unless demand for video games increases. What is likely to happen instead, as the cost of making games falls, is that the price of games for consumers will drop too as competition increases, which will in turn hurt game dev profits, so game dev studios will likely have to be 10x smaller in the future, even if there are still technically people working in the industry.

However, when the work of agricultural workers became increasingly automated, there were lots of other industries people could work in instead; at the time that was factory work. Although the details will be different, I'm sure to some extent this will happen with white collar work too. But the question I'd ask today is: what is that alternative source of work, and is it as good as white collar work?

Our economy went from farming -> factory work -> office work. I strongly suspect the next step will be more people working in manual labour jobs and in servant-type roles. It's hard to see where else the demand will come from.

by kypro

3/9/2026 at 7:15:12 AM

Except those manual labor jobs can eventually be covered by robots as well.

by pjmlp

3/9/2026 at 1:31:55 AM

Traditional software has already automated most of what can be automated.

by deterministic

3/8/2026 at 8:57:45 PM

The "relational work is safe" argument misses that when you make part of a job cheaper to execute, organizations don't cut headcount; they expand what they expect from each person. The consultant doesn't get replaced; they get tasked with 3x the analyses. Total demand on labor goes up, but the nature of the work shifts.

by 7777777phil

3/8/2026 at 7:24:15 PM

I said on a different thread, everyone right now is focused on productivity gains, AI making us faster.

We are only one major incident away from this trend reversing. Now that we have AI, added regulation is less burdensome: more testing requirements, more certification requirements, more security requirements, more accessibility requirements.

Everyone keeps their jobs; the bar goes up. Whenever an industry gets better tools, we raise standards instead of making more cheap junk. We make $25K cars instead of $5K cars at 1960s engineering standards.

by gjsman-1000

3/8/2026 at 7:27:38 PM

As with any tech revolution, jobs don’t go away in total, but certain types of jobs do. There aren’t a lot of buggy whip manufacturers any more. Professional photography has taken a sharp decline. Certain kinds of white collar work are a dead end now.

by conception

3/8/2026 at 7:31:20 PM

I think "professional photography" is so wide a definition that it makes no sense. Product photography could perhaps be replaced with AI, but not photojournalism at all. In fact, photojournalism is more important than ever now that AI exists.

by 101008

3/8/2026 at 7:34:30 PM

There is a lot less of having the picture-taking man come round to take a picture for $$

by sam0x17

3/8/2026 at 8:20:44 PM

Not AI related, cell phone camera related. Crushed the sector.

by conception

3/8/2026 at 7:54:30 PM

> There aren’t a lot of buggy whip manufacturers any more

Nor horses...

by anonymars

3/8/2026 at 7:28:44 PM

Professional photography is still expected to grow over the next decade. It’s a competitive field but hasn’t actually shrunk much.

by gjsman-1000

3/8/2026 at 7:37:58 PM

[flagged]

by simianwords

3/8/2026 at 7:51:37 PM

It’s pretty obvious that professional photography takes real skill if you’ve ever experienced the disaster of someone trying to cut costs by hiring an amateur photographer for a major event like a wedding but expecting professional results (the mismatch in expectations being the key cause of disaster here).

I’m not saying it turns out bad 100% of the time, but it’s easy to forget because good professionals make it look effortless. When the skill isn’t there, though, and you’re used to only seeing professional photos, it becomes very obvious (and again, that’s perfectly fine if you’re not expecting professional photography).

by BoxFour

3/8/2026 at 7:45:54 PM

Even if we completely ignore the artistic and technical merit and complexity (which I would argue mostly moved to post-processing images), or the ability to catch or create the moments you want to capture, simply the fact that someone has to hold and point the camera makes it a job that will continue to exist. If you have a wedding and want pictures, someone has to take them. That won't be the bride or the groom, and shoving the responsibility onto one of the guests isn't nice (it would detract from their enjoyment of the event, after all). So you hire someone to do it. Kind of like a job.

by wongarsu

3/8/2026 at 7:49:40 PM

> the ability to catch or create the moments you want to capture, simply the fact that someone has to hold and point the camera makes it a job that will continue to exist

My bad, yeah, that part is needed, but as an artistic expression I don't see the point.

by simianwords

3/8/2026 at 7:42:13 PM

You don't pay a photographer to click the shutter button; you pay them to handle all the details of composing a good shot, knowing what you'll want afterwards, etc.

by bombcar

3/8/2026 at 7:28:38 PM

> Everyone keeps their jobs

Company bosses somehow see this differently: now that the best performers are empowered by AI, cut the worst performers and still enjoy the efficiency gains!

by nine_k

3/8/2026 at 7:33:13 PM

Which is funny because they are the most AI replaceable humans in the building. Their entire function is to follow the corporate decision tree to the letter and make sure that all communication upwards gets filtered through their outlook account.

by nik282000

3/8/2026 at 8:41:13 PM

The point of being the boss is getting to decide who to replace with AI, tbh. The shareholders may not replace you because of relationships/trust/accountability, and also because they don't want to have to be instructing the AI day-to-day (or arguing among themselves about it).

Maybe this will change in the future if AI-run companies emerge, get backing, and outcompete existing players.

by creamyhorror

3/8/2026 at 9:23:57 PM

A company relying only on AI doesn't have any added value.

What's stopping their customers from using AI directly instead of paying for that company's services?

by eloisant

3/8/2026 at 7:54:22 PM

This. Install some agents on employees' PCs and AI could have an exact picture of the whole company at any given time, without those weekly managerial meetings that are really just status relays. No politics. No overseeing. The more people work remotely, the better AI does, because all communication channels can be monitored. Perfect estimation, near-perfect allocation of resources.

by majgr

3/8/2026 at 7:31:37 PM

I don't think that's what's happening.

Companies massively overhired during Covid after receiving trillions in free money and are now cutting the fat after the well's run dry.

AI productivity is just the excuse to save face because people believe it.

by parineum

3/8/2026 at 10:11:47 PM

Ran it through the analysis grinder; here are the results. Should that be a prerequisite before publishing a thought piece?

Main Points, in Order of Importance

1. Most White Collar Work Is Relationship-Based, Not Transactional. The central claim: a dominant share of workplace "questions" aren't requests for correct answers -- they are social, trust-based exchanges where the relationship and the advisor's judgment are the actual product.

2. Two Kinds of Question-Answering That Keep Getting Conflated. The foundational distinction: transactional questions have a correct answer and an imminent need; relationship-based questions use the question as a pretext for social exchange, shared perspective, and felt understanding. AI handles the first well; it cannot substitute for the second.

3. AI Cannot Replace Trust and the Weight We Give to Respected Opinions. Even a correct AI answer carries less weight than advice from someone whose judgment you trust. This isn't irrational -- it reflects that the value in consulting, advising, and managing is partly in the relationship itself, not just the information delivered.

4. Strategy Consulting as the Illustrative Case. A concrete test domain: buyers of consulting aren't purchasing correct answers; they want advice from trusted people, catharsis in being heard, and help clarifying their own thinking. None of that is substitutable by an AI, regardless of output quality.

5. Human Factors Intensify in Procedural Organizations. An underappreciated corollary: in government and military contexts, which lack market feedback mechanisms, human trust and social organization become even more load-bearing, not less.

Opinion

It's a short, clear piece with a genuinely useful distinction at its center -- but it doesn't fully earn its conclusion.

The two-question-types framework is clean and rings true experientially. Most people have felt the difference between wanting an answer and wanting a conversation, and the observation that these get conflated in AI replacement debates is fair and underappreciated.

Where it falls short is in the leap from "relationship-based questions exist" to "therefore white collar work won't be replaced." The argument proves that AI can't fully substitute for trusted human relationships -- it doesn't prove that organizations will continue to pay for those relationships at current rates, or that AI won't restructure which human interactions are deemed worth paying for.

A client might still want a trusted advisor but find that one advisor supported by AI can now serve ten clients instead of three.

There's also an implicit assumption that the relationship-based component is dominant in most white collar work. That may be true in strategy consulting, but it's a significant empirical claim that the piece asserts rather than argues across the broader category of white collar work.

by mfrankel

3/8/2026 at 7:32:03 PM

Most white collar work is writing documents that no one cares about. I've replaced 99% of my non-meeting workload with AI, and it's doing a great job.

by spaghetdefects