2/16/2026 at 2:16:36 AM
Here's a thought. Let's all arbitrarily agree AGI is here. I can't even be bothered discussing what the definition of AGI is. It's just here, accept it. Or vice versa.
Now what....? What's happening right now that should make me care that AGI is here (or not)? What's the magic thing that's happening with AGI that wasn't happening before?
<looks out of window> <checks news websites> <checks social media...briefly> <asks wife>
Right, so, not much has changed from 1-2 years ago that I can tell. The job market's a bit shit if you're in software... is that what we get for billions of dollars spent?
by hi_hi
2/17/2026 at 1:09:33 AM
> Lets all arbitrarily agree AGI is here. I can't even be bothered discussing what the definition of AGI is.
There is a definition of AGI the AI companies are using to justify their valuations. It's not what most people would call AGI, but it does that job well enough, and you will care when it arrives.
They define it as an AI that can develop other AIs faster than the best team of human engineers. Once they build one of those in house, they outpace the competition and become the winner that takes all. Personally I think it's more likely they will all achieve it at around the same time. That would mean the race continues, accelerating as fast as they can build data centres and power plants to feed them.
It will impact everyone, because the already dizzying pace of the current advances will accelerate. I don't know about you, but I'm having trouble figuring out what my job will be next year as it is.
An AI that just develops other AIs could hardly be called "general" in my book, but my opinion doesn't count for much.
by rstuart4133
2/16/2026 at 3:19:01 PM
What's happening with AGI depends on what you mean by AGI, so "can't even be bothered discussing what the definition" means you can't say what's happening.
My usual way of thinking about it is that AGI means being able to do all the stuff humans do, which means you'd probably, after a while, look out the window and see robots building houses and the like. I don't think that's happening for a while yet.
by tim333
2/16/2026 at 6:25:14 PM
Who would the robots build houses for? No one has a job and no one is having kids in that future.
by kjkjadksj
2/16/2026 at 8:55:34 PM
Where are the robots going to sleep? Outside in the rain?
by elfly
2/16/2026 at 7:56:02 PM
The billionaire elite. Isn’t it obvious? They want to get rid of us.
by therobots927
2/16/2026 at 6:15:29 PM
Indeed: particularly given that, just as a non-exhaustive "for instance", one of the fairly common things expected of AGI is that it's sapient. Meaning, essentially, that we have created a new life form, one that should be given its own rights.
Now, I do not in the least believe that we have created AGI, nor that we are actually close. But you're absolutely right that we can't just handwave away the definitions. They are crucial both to what it means to have AGI, and to whether we do (or soon will) or not.
by danaris
2/16/2026 at 8:52:13 PM
I'm not sure how the rights thing will go. Humans have proved quite able not to give many rights to animals or other groups of humans, even if they are quite smart. Then again, there was that post yesterday with a lady accusing OpenAI of murdering her AI boyfriend by turning off 4o, so no doubt there will be lots of arguments over that stuff. (https://news.ycombinator.com/item?id=47020525)
by tim333
2/16/2026 at 4:43:56 AM
Cultural changes take time. It took decades for the internet to move from nerdy curiosity to an essential part of everyone's life.
The writing is on the wall. Even if there are no new advances in technology, the current state is upending jobs, education, media, etc.
by hackyhacky
2/16/2026 at 4:51:18 AM
I really think corporations are overplaying their hand if they think they can transform society once again in the next 10 years.
Rapid de-industrialization followed by the internet and social media almost broke our society.
Also, I don’t think people necessarily realize how close we were to the cliff in 2007.
I think another transformation now would rip society apart rather than take us to the great beyond.
by materielle
2/16/2026 at 9:17:05 AM
I worry that if the reality lives up to investors' dreams, it will be massively disruptive for society, which will lead us down dark paths. On the other hand, if it _doesn't_ live up to their dreams, then there is so much invested in that dream financially that it will lead to massive societal disruption when the public is left holding the bag, which will also lead us down dark paths.
by foo42
2/16/2026 at 11:50:15 AM
It's already made it impossible to trust half of the content I read online.
Whenever I use search terms to ask a specific question these days, there's usually a page of slop dedicated to the answer which appears top for relevancy.
Once I realize it is slop, I realize the relevant information could be hallucinated, so I can't trust it.
At the same time, I'm seeing a huge upswing in probably human-created content being accused of being slop.
We're seeing a tragedy of the information commons play out on an enormous scale at hyperspeed.
by pydry
2/16/2026 at 3:57:01 PM
You trust nearly half??!!??
by Induane
2/16/2026 at 5:30:16 AM
I think corporations can definitely transform society in the near future. I don't think it will be a positive transformation, but it will be a transformation.
Most of all, AI will exacerbate the lack of trust in people and institutions that was kicked into high gear by the internet. It will be easy and cheap to convince large numbers of people of almost anything.
by hackyhacky
2/16/2026 at 4:57:44 PM
I'm still not buying that AI will change society anywhere near as much as the internet, or smartphones for that matter.
The internet made it so that you can share and access information in a few minutes, if not seconds.
Smartphones built on the internet by making this sharing and accessing of information possible from anywhere and by anyone.
AI seems to occupy the same space as Google in the broader internet ecosystem.
I don't know what AI provides me that a few hours of Google searches wouldn't. It makes information retrieval faster, but that was never the hard part. The hard part was understanding the information, so that you're able to apply it to your particular situation.
Being able to write to-do apps X1000 faster is not innovation!
by the1st
2/16/2026 at 1:24:51 PM
You are assuming that the change can only happen in the West.
The rest of the world has mostly been experiencing industrialisation, and was only indirectly affected by the great crash.
If there is a transformation in the rest of the world the west cannot escape it.
A lot of people in the west seem to have their heads in the sand, very much like when Japan and China tried to ignore the west.
China is the world's second biggest economy by nominal GDP, India the fourth. We have a globalised economy where everything is interlinked.
by graemep
2/16/2026 at 2:32:45 PM
When I look at my own country, it has proven to be open to change. There are people alive today who remember Christianity; now we swear in a gay prime minister.
In that sense Western countries have proven that they are intellectually very nimble.
by expedition32
2/16/2026 at 3:31:54 PM
Three of the best-known Christians I have known in my life are gay. Two are priests (one Anglican, one Catholic). Obviously the Catholic priest had taken a vow of celibacy anyway, so it's entirely immaterial. I did read an interview with a celeb friend of his (also now a priest!) which said he (the priest I knew) thought people did not know he was gay; we all knew, we just did not make a fuss about it.
Even if you accept the idea that gay sex is a sin, the entire basis of Christianity is that we are all sinners. Possessing wealth is a failure to follow Jesus's commands, for instance. You should be complaining a lot more if the prime minister is rich. Adultery is clearly a more serious sin than having the wrong sort of sex, and I bet your country has had adulterous prime ministers (the UK certainly has had many!).
I think Christians who are obsessed with homosexuality as somehow making people worse than the rest of us are both failing to understand Christ's message, and saying more about themselves than about gays.
If you look at when sodomy laws were abolished, countries with a Christian heritage led this. There are reasons for this in the Christian ethos of choice and redemption.
by graemep
2/16/2026 at 2:36:56 PM
> people alive today who remember Christianity now we swear in a gay prime minister
Why would that be a contradiction? Gay people can't be Christian?
by hackyhacky
2/16/2026 at 5:01:43 AM
As a young adult in 2007, what cliff were we close to?
The GFC was a big recession, but I never thought society was near collapse.
by BobbyJo
2/16/2026 at 5:31:52 AM
We were pretty close to a collapse of the existing financial system. Maybe we'd be better off now if it had happened, but the interim devastation would have been costly.
by edmundsauto
2/16/2026 at 11:14:12 AM
We weren't that far away from ATMs refusing to hand out cash, banks limiting withdrawals from accounts (if your bank hadn't already gone under), and a subsequent complete collapse of the financial system. The only thing that saved us from that was an extraordinary intervention by governments, something I am not sure they would be capable of doing today.
by verzali
2/16/2026 at 5:29:51 AM
It felt like the entire global financial system had a chance of collapsing.
by zeroonetwothree
2/16/2026 at 5:03:12 AM
Yeah, this is a good point: transition and transformation to new technologies takes time. I'm not sure I agree the current state is upending things though. It's forcing some adaptation for sure, but the status quo remains.
by hi_hi
2/16/2026 at 5:48:51 AM
> It took decades
It took one September. Then as soon as you could take payments on the internet, the rest was inevitable and in _clear_ demand. People got on long waiting lists just to get the technology in their homes.
> no new advances in technology
The reason the internet became so accessible is because Moore was generally correct. There were two corresponding exponential processes that vastly changed the available rate of adoption. This wasn't at all like cars being introduced into society. This was a monumental shift.
I see no advances in LLMs that suggest any form of the same exponential processes exist. In fact the inverse is true. They're not reducing power budgets fast enough to even imagine that they're anywhere near AGI, and even if they were, that they'd ever be able to sustainably power it.
> the current state is upending jobs
The difference is companies fought _against_ the internet because it was so disruptive to their business model. This is quite the opposite. We don't have a labor crisis, we have a retention crisis, because companies do not want to pay fair value for labor. We can wax on and off about technology, and perceptrons, and training techniques, or power budgets, but this fundamental fact seems the hardest to ignore.
If they're wrong this all collapses. If I'm wrong I can learn how to write prompts in a week.
by themafia
2/17/2026 at 1:25:35 AM
what September?by nubg
2/17/2026 at 2:20:08 AM
This is an allusion to the old days, before the internet became a popular phenomenon. It used to be that every September a bunch of "newbies" (college students who had just gotten access to an internet connection for the first time) would log in and make a mess of things. Then, in the late nineties when it really took off, everybody logged in and made a mess of things. This is the "eternal September." [1]
by hackyhacky
2/16/2026 at 6:45:59 AM
> It took one September.
It's the classic "slowly, then suddenly" paradigm. It took decades to get to that one September. Then years more before we all had internet in our pocket.
> The reason the internet became so accessible is because Moore was generally correct.
Can you explain how Moore's law is relevant to the rise of the internet? People didn't start buying couches online because their home computer lacked sufficient compute power.
> I see no advances in LLMs that suggest any form of the same exponential processes exist.
LLMs have seen enormous growth in power over the last 3 years. Nothing else comes close. I think they'll continue to get better, but critically: even if LLMs stay exactly as powerful as they are today, it's enough to disrupt society. IMHO we're already at AGI.
> The difference is companies fought _against_ the internet
Some did, some didn't. As in any cultural shift, there were winners and losers. In this shift, too, there will be winners and losers. The panicked spending on data centers right now is a symptom of the desire to be on the right side of that.
> because companies do not want to pay fair value for labor.
Companies have never wanted to pay fair value for labor. That's a fundamental attribute of companies, arising as a consequence of the system of incentives provided in capitalism. In the past, there have been opportunities for labor to fight back: government regulation, unions. This time that won't help.
> If I'm wrong I can learn how to write prompts in a week.
Why would you think that anyone would want you to write prompts?
by hackyhacky
2/16/2026 at 9:53:11 AM
> Cultural changes take time. It took decades for the internet to move from nerdy curiosity to an essential part of everyone's life.
99% of people only ever use proprietary networks from FAANG corporations. That's not "the internet", that's an evolution of CompuServe and AOL.
We got TCP/IP and the "web-browser" as a standard UI toolkit stack out of it, but the idea of the world wide web is completely dead.
by otabdeveloper4
2/16/2026 at 3:33:21 PM
Shockingly few realize this. It's a series of mega cities interconnected by ghost towns out here.
by rglover
2/16/2026 at 5:00:31 AM
It also took years for the Internet to be usable by most folks. It was hard, expensive, and impractical for decades.
Just about the time it hit the mainstream, coincidentally, is when the enshittification began to go exponential. Be careful what you wish for.
by webdoodle
2/16/2026 at 5:32:58 AM
Allow me to clarify: I'm not wishing for change. I am an AI pessimist. I think our society is not prepared to deal with what's about to happen. You're right: AI is the key to the enshittification of everything, most of all trust.
by hackyhacky
2/16/2026 at 5:45:15 AM
Governments and companies have been pushing for identity management that connects your real-life identity with your digital one for quite some time. With AI, I believe that's not only a bad thing but maybe unavoidable now.
by bulbar
2/16/2026 at 5:11:55 AM
> Here's a thought. Lets all arbitrarily agree AGI is here.
A slightly different angle on this - perhaps AGI doesn't matter (or perhaps not in the ways that we think).
LLMs have changed a lot in software in the last 1-2 years (indeed, the last 1-2 months); I don't think it's a wild extrapolation to see that'll come to many domains very soon.
by jwilliams
2/16/2026 at 3:39:04 PM
Which domains? Will we see a lot of changes in plumbing?by nradov
2/16/2026 at 5:33:17 PM
If most of your work involves working with a monitor and keyboard, you're in one of the domains.
Even if it doesn't, you will be indirectly affected. People will flock to the trades if knowledge work is no longer a source of viable income.
by joquarky
2/16/2026 at 12:25:55 PM
If AGI were already here, actions would be so greatly accelerated that humans wouldn't have time to respond.
Remember that weather balloon the US found a few years ago that for days was on the news as a Chinese spy balloon?
Whether it was a spy balloon or a weather balloon, the first hint of its existence could have triggered a nuclear war that could have already been the end of the world as we know it, because AGI will almost certainly be deployed to control the U.S. and Chinese military systems, and it would have acted well before any human had time to intercept its actions.
That’s the apocalyptic nuclear winter scenario.
There are many other scenarios.
An AGI which has been infused with a tremendous amount of ethics, so the above doesn't happen, may also lead to terrible outcomes for humans. An AGI would essentially be a different species (although a non-biological one). If it replicated human ethics, even as inconsistently as we apply them, it would learn that treating other species brutally is acceptable (we breed, enslave, imprison, torture, and then kill over 80 billion land animals annually in animal agriculture, and possibly trillions of water animals). There's no reason it wouldn't do that to us.
Finally, if we infuse it with our ethics but it's smart enough to apply them consistently (even a basic application of our ethics would have us end animal agriculture immediately), so that it realizes humans are wrong and doesn't do the same thing to us, it might still create an existential crisis for humans, as our entire identity is based on thinking we are smarter and intellectually superior to all other species, which wouldn't be true anymore. Further, it would erode belief in gods and other supernatural BS we believe in, which might at the very least lead humans to stop reproducing due to the existential despair this might cause.
by hshdhdhj4444
2/16/2026 at 2:42:56 PM
You're talking about superintelligence. AGI is just... an AI that's roughly on par with humans at most things. There's no inherent reason why AGI will lead to ASI.
by armoredkitten
2/16/2026 at 3:25:24 PM
What a silly comment. You're literally describing the plot of several sci-fi movies. Nuclear command and control systems are not taken so lightly.
And as for the Chinese spy balloon, there was never any risk of a war (at least not from that specific cause). The US, China, Russia, and other countries routinely spy on each other through a variety of unarmed technical means. Occasionally it gets exposed and turns into a diplomatic incident, but that's about it. Everyone knows how the game is played.
by nradov
2/16/2026 at 5:59:52 PM
"Nuclear command and control systems are not taken so lightly."
https://gizmodo.com/for-20-years-the-nuclear-launch-code-at-...
by user____name
2/16/2026 at 1:21:13 PM
Sounds fun, let's do it.
by koakuma-chan
2/16/2026 at 1:14:09 PM
AGI is not a death sentence for humanity. It all depends on who leverages the tool. And in any case, AGI won’t be here for decades to come.by deafpolygon
2/16/2026 at 2:30:02 PM
Your sentence seems to imply that we will delegate all AI decisions to one person who can decide how he wants to use it - to build or destroy.
Strong agentic AIs are a death-sentence memo pad (or a malevolent djinn lamp, if you like) that anyone can write on, because the tools will be freely available to leverage. A plutonium breeder reactor in every backyard. Try not to think of paperclips.
by mapt
2/16/2026 at 5:02:52 AM
Before enlightenment^WAGI: chop wood, fetch water, prepare food
After enlightenment^WAGI: chop wood, fetch water, prepare food
by CamperBob2
2/16/2026 at 1:39:27 PM
AGI is a pipe dream and will never exist
by tsukurimashou
2/16/2026 at 5:27:40 PM
Odd to see someone so adamantly insist that we have souls on a forum like HN.
by joquarky
2/16/2026 at 10:52:20 PM
People are taking actions based on its advice.
by m463
2/16/2026 at 7:09:47 AM
AGI would render humans obsolete and eradicate us sooner or later.
by copx
2/16/2026 at 4:11:57 PM
One of the most impactful books I ever read was Alvin Toffler's Future Shock.
Its core thesis was: every era doubled the amount of technological change of the prior era in one half the time.
At the time he wrote the book in 1970, he was making the point that the pace of technological change had, for the first time in human history, rendered the knowledge of society's elders - previously the holders of all valuable information - irrelevant.
The pace of change has continued to steadily increase in the ensuing 55 years.
Edit: grammar
by keernan
2/16/2026 at 8:31:29 AM
Pretty sure marketing teams are already working on AGI v2
by Havoc
2/16/2026 at 6:04:40 AM
I think you are missing the point: if we assume that AGI is *not* yet here, but may be here soon, what will change when it arrives? Those changes could be big enough to affect you.
by munchler
2/16/2026 at 7:25:14 AM
I'm missing the point? I literally asked the same thing you did.
> Now what....? Whats happening right now that should make me care that AGI is here (or not).
Do you have any insight into what those changes might concretely be? Or are you just trying to instil fear in people who lack critical thinking skills?
by hi_hi
2/16/2026 at 6:14:28 PM
You did not ask the same thing. You framed the question such that readers are supposed to look at their current lives and realize nothing is different, ergo AGI is lame. Your approach utilizes the availability bias and the argument-from-consequences logical fallacy.
I think what you are trying to say is: can we define AGI so that we can have an intelligent conversation about what it will mean for our daily lives? But you oddly introduced your argument by stating you didn't want to explore this definition...
by MadcapJake
2/16/2026 at 3:01:43 PM
The economy is shit if you’re anything except a nurse or providing care to old people.by dyauspitr
2/16/2026 at 3:40:14 PM
Electricians are also doing pretty well. Someone has to wire up those new data centers.by nradov
2/16/2026 at 9:50:44 AM
> The job markets a bit shit if you're in software
That's Trump's economy, not LLMs.
by otabdeveloper4
2/16/2026 at 5:46:38 AM
Many devs don't write code anymore. They can really deliver a lot more per dev.
Many people are slowly losing jobs and can't find new ones. You'll see the effects in a few years.
by skeptic_ai
2/16/2026 at 5:48:23 AM
Deliver a lot more tech debt
by reactordev
2/17/2026 at 1:12:30 AM
My LLMs do create non-zero amounts of tech debt, but they are also massively decreasing human-made tech debt by finding mountains of code that can be removed or refactored when using the newest frameworks.
by qingcharles
2/16/2026 at 6:44:33 AM
That tech debt will be cleaned up with a model in 2 years. Not that humans don't make tech debt.
by dainiusse
2/16/2026 at 8:32:57 AM
What that model is going to do in 2 years is replace tech debt with more complicated tech debt.
by shaky-carrousel
2/16/2026 at 9:36:11 AM
One could argue that's a cynically accurate definition of most iterative development anyway.
But I don't know that I accept the core assertion. If the engineer is screening the output and using the LLM to generate tests, chances are pretty good it's not going to be worse than human-generated tech debt. If there's more accumulated, it's because there's more output in general.
by geoelectric
2/16/2026 at 2:37:26 PM
Only if you accept the premise that the code generated by LLMs is identical to the developer's output in quality, just higher in volume. In my lived professional experience, that's not the case.
It seems to me that prompting agents and reviewing the output just doesn't... trigger the same neural pathways for people? I constantly see people submit agent-generated code with mistakes they would never have made themselves when "handwriting" code.
Until now, the average PR had one author and a couple reviewers. From now on, most PRs will have no authors and only reviewers. We simply have no data about how this will impact both code quality AND people's cognitive abilities over time. If my intuition is correct, it will affect both negatively over time. It remains to be seen. It's definitely not something that the AI hyperenthusiasts think at all about.
by krethh
2/16/2026 at 5:38:45 PM
> In my lived professional experience, that's not the case.
In mine it is the case. Anecdata.
But for me, this was over two decades in an underpaid job at an S&P500 writing government software, so maybe you had better peers.
by joquarky
2/17/2026 at 12:45:31 AM
I stated plainly: "we have no data about this". Vibes is all we have.
It's not just me though. Loads of people subjectively perceive a decrease in the quality of engineering when relying on agents. You'll find thousands of examples on this site alone.
by krethh
2/16/2026 at 7:17:41 AM
I actually think it is here. The Singularity happened. We're just playing catch-up at this point.
Has it run away yet? Not sure, but is it currently in the process of increasing intelligence with little input from us? Yes.
Exponential graphs always have a slow curve in the beginning.
by xhcuvuvyc
2/16/2026 at 7:32:07 AM
Didn't you get the memo? Tuesday. Tuesday is when the Singularity happens.
Will there still be ice cream after Tuesday? General societal collapse would be hard to bear without ice cream.
by hi_hi
2/16/2026 at 5:44:24 PM
Tuesday at 4 p.m. to be specific.
by joquarky
2/16/2026 at 6:01:49 AM
I've been writing code for 20 years. AI has completely changed my life and the way I write code and run my business. Nothing is the same anymore, and I feel I will be saying that again by the end of 2026. My productive output as a programmer in software and business has expanded 3x *compounding monthly*.
by znnajdla
2/16/2026 at 6:15:19 AM
> My productive output as a programmer in software and business have expanded 3x compounding monthly.
In what units?
by myegorov
2/16/2026 at 12:56:26 PM
Tasks completed in my todo list software; I've been measuring my output for 5 years. Time saved, because I built one-off tools to automate many common workflows. And yes, even dollars earned.
I don't mean 3x compounding monthly every month, I mean 3x total since I started using Claude Code about 6 months ago, but the benefits keep compounding.
by znnajdla
2/16/2026 at 9:02:53 AM
GWh
by freshbreath
2/16/2026 at 12:55:37 PM
Going from gigajoules to terajoules.
by tmtvl
2/16/2026 at 8:51:09 AM
Vibes
by merek
2/16/2026 at 6:46:28 AM
Going from punch cards to terminals also "completely changed my life and the way I write code and run my business".
Firefox introducing their dev debugger many years ago "completely changed my life and the way I write code and run my business".
You get the idea. Yes, the day to day job of software engineering has changed. The world at large cares not one jot.
by hi_hi
2/16/2026 at 5:34:00 PM
I mean, 2025 had the weakest job creation growth numbers outside of recession periods since at least 2003. The world seems to care in a pretty tangible way. There are other big influencing factors for that, too, of course.
by brynnbee
2/16/2026 at 6:10:59 AM
Okay. So software engineers are vastly more efficient. Good, I guess. "Revolutionize the entire world such that we rethink society down to its very basics like money and ownership" doesn't follow from that.
by UncleMeat
2/16/2026 at 6:20:15 AM
Man, you guys are impatient. It takes decades for even earth-shattering technologies to mature and take root.
by pennomi
2/16/2026 at 6:55:44 AM
Damn right I'm impatient. My eye starts twitching when a web page takes more than 2 seconds to load :-)
In the meantime, I've had to continuously hear talk about AI, both in real life (like at the local pub) AND virtually (tv/radio/news/whatever), and how it's going to change the world in unimaginable ways, for the last 2-3 years. Billions upon billions of dollars are being spent. The only tangible thing we have to show is that software development, and some other fairly niche jobs, have changed _a bit_.
So yeah, excuse my impatience for the bubble to burst, I can stop having to hear about this shit every day, and I can go about my job using the new tools we have been gifted, while still doing all the other jobs that sadly do not benefit in any similar way.
by hi_hi
2/16/2026 at 9:57:15 AM
> The only tangible thing we have to show is software development, and some other fairly niche jobs, have changed _a bit_.
There is zero evidence that LLMs have changed software development efficiency.
We get an earth-shattering developer productivity gamechanger every five years. All of them make wild claims, none of them ever have any data to back those claims up.
LLMs are just another in a long, long list. This too will pass. (Give it five years for the next gamechanger.)
by otabdeveloper4
2/16/2026 at 3:01:47 PM
If people want to make the "this will be AGI after two decades and will totally revolutionize the entire world" argument, that's fine. If people want to make the "wow, this is an incredibly useful tool for many jobs that will make work more efficient" argument, that's fine. We can have those discussions.
What I don't buy is the "in two years there will be no more concept of money or poverty because AI has solved everything" argument using the evidence that these tools are really good at coding.
by UncleMeat
2/16/2026 at 6:18:27 AM
Are you working 3x less time, compounding monthly?
Are you making 3x the money, compounding monthly?
No?
Then what's the point?
by waterTanuki
2/16/2026 at 12:54:30 PM
Yes and yes.
by znnajdla
2/16/2026 at 1:27:49 PM
Okay, teach me how, then? I would also like to work 3× less and make 3× more.
by timeattack
2/16/2026 at 5:49:31 PM
People keep impatiently expecting proof from builders with no moat. It's like that Upton Sinclair quote.
by joquarky
2/16/2026 at 5:34:55 PM
Start a software business, presumably.
by brynnbee
2/16/2026 at 4:44:14 PM
Ten more months in 2026, so you should be about 60,000x better by the end of the year.
by kbelder
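For anyone checking the joke's arithmetic: taking the claimed "3x, compounding monthly" literally over the ten months remaining in 2026 gives 3^10, which is where the roughly-60,000x figure comes from. A quick sketch:

```python
# "3x productivity, compounding monthly" taken literally
# over the ten months remaining in 2026: 3^10.
factor = 3
months = 10
total = factor ** months
print(total)  # 59049, i.e. roughly 60,000x
```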
2/16/2026 at 5:12:58 PM
You say that as if it's impossible, but there are several indie makers that have gone from $10 MRR to $600k MRR over the past 8 months.
by znnajdla
2/16/2026 at 2:10:05 PM
It's weird that you guys keep posting the same comments with the exact same formatting.
You're not fooling anyone.
by hackable_sand