3/7/2026 at 3:10:17 PM
The Megahertz Wars were an exciting time. Going from 75 MHz to 200 MHz meant that everything (CPU limited) ran 2x as fast (or better with architectural improvements).
Nothing since has packed nearly the impact, with the exception of going from spinning disks to SSDs.
by xnx
3/7/2026 at 4:24:55 PM
In my experience, SSDs had a bigger impact. Thanks to Wirth's Law (https://en.wikipedia.org/wiki/Wirth%27s_law), the steady across-the-board increase in processing power didn't equate to programs running much faster; e.g. Discord running on a modern computer isn't any more responsive, and is arguably less responsive, than an ICQ client was on a computer 25 years ago.
SSDs provided a huge bump in performance to each individual computer, but trickled their way into market saturation over a generation or two of computers, so you'd be effectively running the same software but in a much more responsive environment.
by dlcarrier
3/7/2026 at 5:02:30 PM
Anytime you upgraded from a 4 year old computer to a new one back then - from 16 MHz to 90 MHz, or 75 MHz to 333 MHz, or 333 MHz to 1 GHz, or whatever - it was immediate, it was visceral.
SSDs booted faster and launched programs faster and were a very nice change, but they weren't that same sort of night-and-day 80s/90s era change.
The software, in those days, was similarly making much bigger leaps every few years. 256 colors to millions, resolution, capabilities (real time spellcheck! a miracle at the time.) A chat app isn't a great comparison. Games are the most extreme example - Sim City to Sim City 2000; Doom to Quake; Unreal Tournament to Battlefield 1942 - but consider also a 1995 web browser vs a 1999 one.
by majormajor
3/7/2026 at 8:00:43 PM
For me, at 52, I recall the SSD transformation as near miraculous. I never once felt that way about a CPU upgrade until getting an M1. I went from a Cyrix 5x86 133 (which was effectively a fast 486) to a Pentium II 266 and it just wasn't that impressive.
The drag of swapping became almost a non-issue with the SSD changeover.
I suppose going from a //e to a IIgs was that kind of leap but that was more about the whole computer than a cpu.
Now I have to say, swapping to an SSD on my Windows machines at work was far less impressive than going to SSD with my Macs. I sort of wrote that off as all the antivirus crap that was running. It was very disappointing compared to the transformation on Mac. On my Macs it was like I suddenly heard the hallelujah chorus when I powered on.
by y1n0
3/7/2026 at 9:42:38 PM
I went from a 386 DX 33 to a Pentium 75, which wasn't a wild amount of time. I'd argue that's way bigger than when I got an SSD (but I agree the SSD was a huge improvement).
by ubercore
3/8/2026 at 2:58:59 AM
I went from a 1 MHz Apple //e in 1986 to a Mac LC II with a 16 MHz 68030 in 1992. That was the last time I felt a step change in day to day work. Of course things like games, video, and audio encoding got faster.
The next time I felt a step change was the M series of Macs.
by raw_anon_1111
3/7/2026 at 9:34:54 PM
I agree. There were only 2 game changing upgrades for me. One was hard disk to SSD. The other was x86 laptop to M1.
by aurareturn
3/7/2026 at 11:05:28 PM
You really didn't feel Pentium 4 to Core 2 Duo was a 'game changer'?
by bigDinosaur
3/8/2026 at 4:49:33 AM
Software was already far down the bloat path by the time the Core 2 Duo came out, so the upgrade didn't make all that much of a difference in feel, given how much latency was caused by software performing random reads off a disk. That's why SSDs made such a huge difference.
Back in the MS-DOS days, the amount of data that needed to be read off a disk while the OS booted was negligible, so a second or two on a fast 486 felt amazing compared to the incredibly slow grind of watching code execute on an 8086 or slow 80286. Software was still in the space of having to run tolerably on an 8086, so the added resources of a newer faster machine actually did improve the feel of the system.
by bcrl
3/8/2026 at 9:04:19 AM
Athlon 3200+ to Core 2 Duo. No, it didn't feel like as big a change as the M1.
The M1 allowed me to do things I thought were impossible: fast, fanless, cool, and extremely long battery life.
by aurareturn
3/7/2026 at 11:34:37 PM
Moving from floppy disk to hard disk was pretty big for me. :)
by qsi
3/8/2026 at 12:52:29 AM
Hey, moving from cassette tape to floppy was also pretty awesome - random access speed demon!
by trailbits
3/8/2026 at 4:54:19 AM
Absolutely. I was amazed with going from C64 datasette to Amiga 500 floppy.
by snvzz
3/7/2026 at 5:55:15 PM
That's my point: the software was getting bloated at least as fast as the CPUs were getting faster, so you had to upgrade to a new CPU every few years to run the latest software. With SSDs, there was a huge overlap in CPU speeds that might or might not have an SSD, so upgrading to one meant a huge performance boost within the same set of runnable software.
Also, going from SimCity to SimCity 2000 was pre-bloat. Over the course of five years, the new version was significantly better than the original, but they both target the same 486 processor generation, which was brand new when the original SimCity was released but rather old by the time SimCity 2000 was released. Another five years later, SimCity 3000 added minimal functionality but required not just a Pentium processor, but a fast one.
I guess what I'm getting at is that a faster CPU means programs released after it will run better, but faster storage means that all programs, old and new, will run better.
by dlcarrier
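The asymmetry dlcarrier describes can be sketched with a toy two-term model. The CPU/storage split and the speedup factors below are illustrative numbers, not measurements: a faster CPU shrinks only the compute term, while faster storage shrinks the I/O term for every program, old or new.

```python
def wall_time(cpu_s, io_s, cpu_speedup=1.0, io_speedup=1.0):
    """Toy model: total time = CPU-bound part + storage-bound part."""
    return cpu_s / cpu_speedup + io_s / io_speedup

# Hypothetical app launch: 2 s of computation, 8 s of random disk reads.
baseline = wall_time(2.0, 8.0)                    # HDD, old CPU: 10.0 s
cpu_bump = wall_time(2.0, 8.0, cpu_speedup=2.0)   # 2x faster CPU, same HDD: 9.0 s
ssd_bump = wall_time(2.0, 8.0, io_speedup=50.0)   # same CPU, SSD random reads
print(baseline, cpu_bump, ssd_bump)
```

In this sketch the doubled CPU barely moves an I/O-bound launch, while the SSD cuts it nearly fivefold, and the storage speedup applies to old and new software alike.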
3/7/2026 at 6:06:20 PM
> That's my point, the software was getting bloated at least as fast as the CPUs were getting faster
I think there's a difference between bloat and actually useful features or performance.
For example, I started making music with computers in the early 90s. They were only powerful enough to control external equipment like synthesizers.
Nowadays, I can do everything I could do with all that equipment on an iPad! I would not call that bloat.
On the other hand, comparing MS Teams to say ICQ, yeah, a lot of that is bloat.
by steve1977
3/7/2026 at 6:19:04 PM
> in the early 90s. They were only powerful enough to control external equipment like synthesizers.
Tell that to ScreamTracker!
by myself248
3/7/2026 at 6:50:03 PM
In case anyone's wondering:
by matheusmoreira
3/7/2026 at 7:20:19 PM
ScreamTracker was sampling. Great for the days, and much more accessible for the teenager I was than buying and controlling synths, but that was not exactly the same. More a competitor to the early Akai MPCs.
And we were mostly ripping those samples from records on cassettes and CDs, or from other mods.
by prmoustache
3/7/2026 at 8:55:42 PM
Well now that you mention that, my very first steps actually were with Soundmonitor on a C64, one of the OG trackers probably (even though not called a tracker yet IIRC). I kind of forgot about that, as that was still very amateurish (I mean what I made with it, not the software).
https://www.c64-wiki.de/images/f/f1/rockmon3.png
Or also at https://www.youtube.com/watch?v=roBkg-iPrbw&t=400s in the video already linked below. And yes, I had to type in that listing.
by steve1977
3/7/2026 at 6:55:27 PM
There is definitely bloat. A few months ago I was messing about with making a QWERTY piano in a web page, and it was utterly unplayable due to the bloat-induced latency in between the fingers and the ears.
by jstanley
3/8/2026 at 12:26:35 AM
I wouldn't call that bloat; certainly we've been complaining about software bloat as long as I've been into computers, but at that time, software was simply pushing the capabilities of the hardware, and often running into walls.
These days, we value developer productivity over performance optimization, so we have stuff like Electron apps. The reason behind it is that CPUs (and RAM quantity, for the most part) are so far ahead of regular desktop applications that it doesn't matter. In the 80s and 90s, the hardware could barely keep up with decently-optimized software that wanted to do anything interesting.
by kelnos
3/8/2026 at 11:53:19 AM
Who is this 'we'?
The vast majority have this foisted upon them by a minority of managers/shareholders.
by benj111
3/7/2026 at 5:28:54 PM
> SSDs booted faster and launched programs faster and were a very nice change, but they weren't that same sort of night-and-day 80s/90s era change.
For me they were.
I still remember the first PC I put together for someone with a SSD.
I had a quite beefy machine at the time and it would take 30 seconds or more to boot Windows, and around 45s to fully load Photoshop.
Built this machine for someone with entirely low-end (think like "i3" not "Celeron") components, but it was more than enough for what they wanted it for. It would hit the desktop in around 10 seconds, and Photoshop was ready to go in about 2 seconds.
(Or thereabouts--I did time it, but I'm remembering numbers from like a decade and a half ago.)
For a _lot_ of operations, the SSD made an order of magnitude difference. Blew my mind at the time.
by nucleardog
3/7/2026 at 7:36:32 PM
SSDs came out after CPUs started to slow down on doubling (single threaded) performance every 12-18 months or so.
So it was the only way to get that visceral improvement in user experience, like CPU and platform upgrades were in the mid 90s to very early 00s.
The experience of just slapping a new SSD in a 3 year old machine gave a different generation of computer nerds that same feeling.
Nothing could really match the night and day difference of an entire machine being double to triple the performance in a single upgrade though. Not even the upgrade from spinning disks to SSD. You'd go from a game being unplayable on your old PC to it being smooth as butter overnight. Not these 20% incremental improvements. Sure, load times didn't get too much better - but those started to matter more when the CPU upgrades were no longer a defining experience.
by phil21
3/7/2026 at 5:49:31 PM
Sure, but what about once Photoshop was open? Aka where you spend most of your day after you start up your stuff?
Would you take the SSD and a 500 MHz processor, or a 2 GHz dual-core with a 7,200 or 10,000 RPM HDD? "Some operations are faster" vs "every single thing is wildly faster" from the every-few-years quadrupling+ of CPU perf, memory amounts, disk space, etc.
(45sec to load Photoshop also isn't tracking with my memory, though 30s-1min boot certainly is, but I'm not invested enough to go try to dig up my G4 PowerBook and test it out... :) )
by majormajor
3/7/2026 at 10:17:08 PM
Nah, I agree with him. Spinning disks were always a huge bottleneck (remember how long MS Word took to open?) and SSDs basically fixed that overnight. The CPU advancements were big, but software had a chance to "catch up" (i.e. get less efficient) because it was a gradual change. That didn't really happen with SSDs because the change was so sudden and big.
I'd say software never really "caught up" to the general slowness that we had to endure in the HDD era either. Even my 14 year old desktop starts Word in a few seconds, compared to upwards of 60s in the 90s.
The closest I've seen is the shitty low end Samsung Android tablet we got for our kids. It's soooo slow and laggy. I suspect it's the storage. And that was actually an upgrade over the Amazon Fire tablet we used to have, which was so slow it was literally unusable. Again I suspect slow storage is the culprit.
by IshKebab
3/7/2026 at 8:59:16 PM
C64 in 1982, Amiga in 1985.
Never witnessed anything before or after with that jump in specs.
by dep_b
3/8/2026 at 2:42:36 AM
I was a PC gamer in the late 90s. It was very expensive. Nowadays you can build a nice rig and you can be sure to play all the latest games for 5 years.
by PearlRiver
3/7/2026 at 4:51:47 PM
> Discord running on a modern computer isn't any more responsive, if not less responsive than an ICQ client was running on a computer 25 years ago.
The only thing more impressive than hardware engineers' delivering continuous massive performance improvements for the past several decades is software engineers' ability to completely erase that with more and more bloated programs to do essentially the same thing.
by gavinsyancey
3/7/2026 at 5:21:16 PM
You joke, but it really is more work. I've developed software in languages from assembly language to JavaScript, and for any given functionality it's been easier to write it in RISC assembly language running directly on the hardware than to get something working reliably in JavaScript running on a framework in an interpreter in a VM in a web browser, where it's impossible to reliably know what a call is going to do, because everything is undocumented and untested.
One of the co-signers of the Agile Manifesto had previously stated that "The best way to get the right answer on the Internet is not to ask a question; it's to post the wrong answer." (https://en.wikipedia.org/w/index.php?title=Ward_Cunningham#L...) I'm convinced that the Agile Manifesto was an attempt to post the most-wrong way to manage a software project, in hopes someone would correct it with the right answer, but instead it was adopted as-is.
by dlcarrier
3/7/2026 at 11:05:18 PM
Even with older lower-level languages like C and COBOL '02, it's easier to do simple things like find a file, read the file, and draw the file on the screen as a raster image using a resizable canvas than it is to write the JavaScript to do the same thing.
The mangling of JavaScript to fit through every hole seems to be the biggest mistake made in modern programming, and I'm not sure what even keeps it going aside from momentum. At first it regained ground because Flash was going EOL, but now?
by Tanoc
3/9/2026 at 12:43:33 AM
COBOL is a high-level language, and on the higher end to boot, with support for object-oriented programming being added over 20 years ago. C is also high level, but the joke is that it "combines all the elegance and power of assembly language with all the readability and maintainability of assembly language". (http://catb.org/jargon/html/C/C.html)
Anyway, C doesn't support any of those things you mentioned, or even functionality as basic as memory allocation, but what it does have is a user base that so consistently uses the same library for most functionality that it has earned its moniker as the "C standard library" and it's usually conflated with the language itself.
JavaScript, on the other hand, has more frameworks than there are programming languages in common use.
by dlcarrier
3/8/2026 at 11:56:52 AM
So if C is easier, why are we using JavaScript?
by benj111
3/9/2026 at 12:49:16 AM
Who knows?
It's not even ambiguous; JavaScript uses syntax inherited from C, so if you can program in JavaScript, you can program in C, where you get a performant, stable, and simple standard library, instead of the framework-of-the-month club in JavaScript.
by dlcarrier
3/7/2026 at 10:53:34 PM
What makes Agile the most-wrong way to manage, in your opinion? I'm curious.
by dale_glass
3/8/2026 at 3:13:07 AM
I think the poster was partly being facetious about being corrected on the internet.
Agile has turned out to be terrible in most organisations I've worked at: from officious PMs demanding sprints stupidly mismatched to the deliverables, to the use of story points as a baseball bat of invented malarkey instead of proper estimation. The useless ceremonies that aren't tuned to the state of the project. The fantasy land of ill-specified tickets and terrible business analysis landing on developers.
Not all of that is purely due to agile but the ephemeral nature of sprints seem to encourage terrible behaviour from the non technical parts of the project management cycle.
I hate it.
by smackeyacky
3/7/2026 at 7:00:23 PM
What's the most complex thing you wrote in RISC assembly?
by iknowstuff
3/8/2026 at 12:22:21 AM
> In my experience, SSDs had a bigger impact.
When SSDs became mainstream, yes, I agree they had a bigger impact than any CPU speed increases at that particular time.
But back in the double-digit MHz days of CPU speeds, upgrading your CPU was king when it came to better performance, and I'd argue that effect was more pronounced than the HDD to SSD transition was. It's hard to convey what huge jumps CPUs were making during that time period, and how big a difference it made.
I also remember a time, somewhere in the middle of that, when adding more RAM could be a bigger boost than a CPU upgrade. But back in the 80s and 90s (and prior, but I have no personal experience with that), there was only so much RAM you could add, and the CPU was still often what was holding you back.
But CPUs just haven't been the bottleneck for most home user workloads for a long time now. These days when I buy a new laptop, I certainly want the best CPU I can get, but I'm more concerned about how much RAM I can put in it, and the iGPU's specs. (SSDs are a given, so I don't need to think much about it.)
by kelnos
3/7/2026 at 4:32:18 PM
> Discord running on a modern computer isn't any more responsive, if not less responsive than an ICQ client was running on a computer 25 years ago.
I feel this. Humanity has peaked.
by vachina
3/7/2026 at 6:49:09 PM
Every time Discord updates (which is often) I'm like "cool, slightly more code to run on the same hardware..."
by accrual
3/7/2026 at 5:39:14 PM
Agree 100%. The compute was always bottlenecked by insanely high I/O latency. SSDs opened up fast computers like no processor ever did.
by beastman82
3/7/2026 at 7:53:46 PM
Eh. In the 1980s and 1990s, the capabilities of the software you could run on your new computer were changing dramatically every two years or so. Completely new types of computer games and productivity software, vastly improved audio and video, more and more real-time functionality.
Nowadays, you really don't get these magical moments when you upgrade, not on the device itself. The upgrade from Windows 10 to Windows 11 was basically just more ads. Games released today look about as good as games released 5-10 years ago. The music-making or photo-editing program you installed back then is still good. Your email works the same as before. In fact, I'm not sure I have a single program on my desktop that feels more capable or more responsive than it did in 2016.
There's some magic with AI, but that's all in the cloud.
by lich_king
3/8/2026 at 10:24:13 AM
Everything is less responsive than it used to be. Every single thing.
by vrighter
3/7/2026 at 6:02:07 PM
I mean, HDDs were much faster than floppy disks. Which were in turn much faster than tape cassettes. And so on...
by steve1977
3/7/2026 at 4:50:36 PM
This is silly. That's like saying that machines haven't gotten any better because a helicopter doesn't eat any less hay than a horse did.
by idiotsecant
3/7/2026 at 10:00:56 PM
Debian Sarge, Kopete with KDE3: 256 MB of RAM, AMD Athlon 2000.
Windows 11, Discord: 4 GB are not enough to run it well.
FYI, Kopete allowed inline LaTeX, YouTube videos (low res, OK, 480p maybe, but it worked), emoticons, animations, videoconferencing, themes, maybe basic HTML tags, and whatnot. And it ran fast.
by anthk
3/7/2026 at 5:23:51 PM
I don't follow your analogy. Can you elaborate?
by dlcarrier
3/7/2026 at 3:41:45 PM
> The Megahertz Wars were an exciting time.
About a week ago, completely out of the blue, YouTube recommended this old gem to me: https://www.youtube.com/watch?v=z0jQZxH7NgM
A Pentium 4, overclocked to 5GHz with liquid nitrogen cooling.
Watching this was such an amazing throwback. I remember clearly the last time I saw it, which was when an excited friend showed it to me on a PC at our schools library. A year or so before YouTube even existed.
By 2005, my Pentium 4 Prescott at home had some 3.6GHz without overclocking, 4GHz models for the consumer market were already announced (but plagued by delays), but surely 10GHz was "just a few more years away".
by st_goliath
3/7/2026 at 6:53:37 PM
IIRC, part of the GHz problem is that very long pipelines like that of the Pentium 4 tend to show increasing benefits at higher clocks. If you can keep the pipeline full then the system reaps the benefits. Sort of like a drag racer - goes very fast in a straight line but terrible on corners.
But with longer pipelines come larger penalties when the pipeline needs to be flushed, so the P4 eventually hit a wall and Intel returned to the late Pentium 3 Tualatin core, refining it into the Pentium M, which later evolved into the first Core CPUs.
by accrual
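The tradeoff accrual describes can be caricatured in a few lines. Branch frequency, mispredict rate, and the stage counts below are rough illustrative figures, not measured values; the model simply charges each mispredicted branch roughly one full pipeline refill.

```python
def perf(clock_ghz, pipeline_stages, branch_freq=0.2, mispredict_rate=0.1):
    """Crude throughput model: instructions per ns = clock / effective CPI,
    where each mispredicted branch costs about one pipeline refill."""
    cpi = 1.0 + branch_freq * mispredict_rate * pipeline_stages
    return clock_ghz / cpi

deep = perf(3.8, 31)    # Prescott-like: ~31 stages at a high clock
short = perf(2.1, 12)   # Pentium M-like: ~12 stages at a lower clock

# Perf per GHz shows where the deep pipeline squanders its clock advantage:
print(deep / 3.8, short / 2.1)
```

In this toy model the deep design needs 81% more clock to come out only about 39% faster; push the mispredict cost or rate a little higher and the advantage evaporates, which is roughly the wall the P4 hit.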
3/7/2026 at 4:09:08 PM
Only just last year did someone goose a PC CPU to 9.13 GHz:
https://www.tomshardware.com/pc-components/cpus/core-i9-1490...
by fnord77
3/7/2026 at 3:47:29 PM
> Nothing since has packed nearly the impact with the exception of going from spinning disks to SSDs.
"Bananas" core counts gave me the same experience. Some years ago I moved to a Ryzen Threadripper and experienced similar "Wow, compiling this project is now 4x faster" or "processing these TBs of data is now 8x faster" moments, but of course it's very specific to workloads where concurrency and parallelism are thought of from the ground up, not a general 2x speedup in everything.
by embedding-shape
3/7/2026 at 4:50:37 PM
That wasn't how it worked.
Up until the 486, the clock speed and bus speed were basically the same and topped out at about 33MHz (IIRC). The 486 started the thing of making the CPU speed a multiple of the bus speed, e.g. 486dx2/66 (33MHz CPU, 66MHz bus), 486dx4/100 (25MHz CPU, 100MHz bus). And that's continued to this day (kind of).
But the point is the CPU became a lot faster than the IO speed, including memory. So these "overdrive" CPUs were faster but not 2-4x faster.
Also, in terms of impact, yeah, there was a massive increase in performance through the 1990s, but let's not forget the first consumer GPUs, namely 3dfx Voodoo and later NVidia and ATI. Oh, Matrox Millennium anyone?
It's actually kind of wild that NVidia is now a trillion dollar company. It listed in 1998 for $12/share and adjusted for splits, Google is telling me it's ~3700x now.
by jmyeet
3/7/2026 at 9:51:59 PM
You got your multipliers backwards with the 486dx. The multiplier was on the CPU core rather than the bus. A dx2 ran at twice the memory bus speed. The dx4 was (confusingly) three times the bus speed. So a 486dx4/100 was a 33MHz bus with a 100MHz core.
by giantrobot
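A minimal sketch of the corrected clock math (bus clocks rounded to whole MHz; the real parts ran a 33.3 MHz bus, which is why 3x lands on the marketing "100"):

```python
def core_clock(bus_mhz, multiplier):
    """486-era clock math: core clock = bus clock x on-chip multiplier."""
    return bus_mhz * multiplier

print(core_clock(33, 2))  # 66: the DX2/66 doubled a 33 MHz bus
print(core_clock(33, 3))  # 99: the DX4/100 tripled it, despite the "4" in the name
```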
3/7/2026 at 4:35:24 PM
I still remember my first CPU with a heatsink. It seemed like a temporary dumb hack.by rr808
3/7/2026 at 7:02:42 PM
Well, it kinda was! Seeing how power-efficient iPhone chips are despite hovering at the top of single-core benchmarks.
by iknowstuff
3/8/2026 at 1:44:56 AM
Nice, I didn't think of that. I love the fanless trend in Apple MacBook Airs.
by rr808
3/7/2026 at 6:09:13 PM
I had the same inclination back in the 90s when I upgraded from my Cyrix 486 SLC2 50MHz without a heat sink (which seems like a no-no in retrospect) to a Cyrix MediaGX 133MHz. The stock fan was immediately noticeable. I thought I had done something wrong.
by oso2k
3/7/2026 at 6:29:01 PM
Upgrading and Repairing PCs 4th edition even says directly, that some shady resellers will put a heatsink on a chip that they're running beyond spec, but that Intel designs all their processors to run at rated speed without one.by myself248
3/7/2026 at 11:58:20 PM
I had a PC with an old PII or PIII cartridge.
The CPU and heatsink were fully integrated into what looked like a NES cart, with an integrated fan and everything. It was not really possible to separate the CPU and the heatsink, as the locking mechanism that kept the cart in place on the motherboard interfaced with the heatsink assembly.
So I'm a little dubious of that no-heatsink claim.
by beAbU
3/7/2026 at 7:51:38 PM
I've never seen a Xeon without a heat sink, I don't believe they are designed to run without one.by SoftTalker
3/7/2026 at 10:03:26 PM
Indeed, even the oldest, slowest Xeons shipped in SECC cartridges with integrated heatsinks.
But that was several years after the book cited by the GP was published (1994, shortly after the release of the original Pentium).
by jasomill
3/8/2026 at 5:55:20 PM
Ah I missed that on my first read, was more focused on the claim at the end of the sentence. Thanks.by SoftTalker
3/7/2026 at 9:43:36 PM
The first Xeon looks to have been released in 1998, so that sounds about right.
by rr808
3/7/2026 at 3:20:40 PM
SSDs were such a revolution though, and a really rewarding upgrade. I'd fit SSDs to friends' and family members' computers as an upgrade.
3/7/2026 at 3:45:08 PM
Getting my first SSD was absolutely the best computer upgrade I've ever bought. I didn't even realise how annoying load times were because I was so used to them, and coming from C64s and Amigas even spinning rust seemed fairly quick.
It took a long time before I felt a need to improve my PC's performance again after that.
by micv
3/7/2026 at 4:02:19 PM
There were quite a few mind-blowing upgrades back in the day. The first sound card instead of the PC beeper was one of my most memorable moments.
I remember loading up Doom, plugging in my shitty earbuds that had a barely long enough cable, and hearing the "real" shotgun sound for the first time. Oo-wee
by coffeebeqn
3/7/2026 at 3:35:07 PM
I once had a decade old Thinkpad that suddenly became my new work laptop once more thanks to an SSD. It's a true shame they simply don't make them like this anymore.
by sigmoid10
3/7/2026 at 7:41:55 PM
I owe much of my career to an SSD. I had a work laptop that I upgraded myself with an 80GB Intel SSD, which was pretty exotic at the time. It was so fast at grepping through code that I could answer colleagues' questions about the code in nearly real time. It was like having a superpower.
by patwolf
3/7/2026 at 3:56:07 PM
Just before I installed an SSD was the last time I owned a computer that felt slow.
by dcminter
3/7/2026 at 8:22:21 PM
When Alder Lake finally made a sizable jump, I looked at decades of old tests I'd done along the way with CPUs and tried to bridge them together reasonably.
Between IPC (~50 to 100-fold improvement) and clock speed increases (1000-fold alone), I estimated that single-thread performance has increased on the order of 50,000x - 100,000x since the 4.77 MHz 8088.
In human terms this is like one minute compared to one month!
by OldSchool
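The arithmetic behind OldSchool's estimate can be checked on the back of an envelope; the ~5 GHz modern clock figure below is my assumption, not from the comment.

```python
clock_factor = 5000 / 4.77       # 4.77 MHz 8088 -> ~5 GHz modern core, ~1000x
ipc_low, ipc_high = 50, 100      # the claimed IPC improvement range

print(round(clock_factor * ipc_low))    # roughly 52,000x
print(round(clock_factor * ipc_high))   # roughly 105,000x

# The "one minute vs one month" analogy lands in the same range:
minutes_per_month = 30 * 24 * 60
print(minutes_per_month)                # 43,200 minutes in a 30-day month
```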
3/7/2026 at 6:39:05 PM
I think the single biggest jump I ever experienced was my first dedicated GPU — a GeForce 2 MX if I'm not mistaken.by pdpi
3/7/2026 at 7:55:07 PM
I remember our school getting new computers to replace the 233 MHz G3 iMac computer lab during the Megahertz Wars, and the vice principal announcing the purchase of new "screaming fast" 600 MHz Dell OptiPlex GX100s. The nice thing is that the G3 iMacs then got pushed out to the classrooms, but it was sad to see Apple lose the spot in the lab. I miss the wonder of playing Pangea Software games for the first time, like Bugdom and Nanosaur.
by iwontberude
3/7/2026 at 6:17:18 PM
Agreed. That was the next big boost! I installed my first SSD in this HP workstation-grade laptop that we got "for free" from college. It was like getting a brand new computer! In fact, I ended up giving that computer to my sister, who ran it into the ground.
I didn't feel any huge speed boosts like that until the M1 MacBook in 2020.
by nunez
3/7/2026 at 3:48:52 PM
GPUs for 3D graphics were a game changer.
I can see why you wouldn't consider it as impactful if you weren't into gaming at the time.
by geon
3/7/2026 at 4:58:33 PM
I don't know. I felt this way when switching from an Intel laptop to an Apple M1. I am still using it today and I prefer it over a desktop PC.
by varispeed
3/7/2026 at 10:14:20 PM
I also went from an Intel MacBook Pro to an M1 and appreciated it, but that leap was exaggerated by how bad the last few generations of Intel MacBook Pros were.
The Apple Silicon chassis was finally allowed to house an appropriate cooling solution, too. They are much quieter than the equivalent Intel laptops when dissipating the same power levels.
by Aurornis
3/7/2026 at 5:21:33 PM
Have you ever used proper desktop computers? I suppose such a move would feel significant if you've mostly been using laptops.
by embedding-shape
3/7/2026 at 6:13:26 PM
But that's the thing; a laptop is fundamentally different. Of course if there's the equivalent of a heat pump under my desk I'm going to get crazy performance. The magic was that Apple brought the uncompromised experience to a laptop.
by philistine
3/7/2026 at 10:18:38 PM
> The magic was that Apple brought the uncompromised experience to a laptop.
Apple's power efficiency was a great bump forward, but the performance claims were a little exaggerated. I love my Apple Silicon devices, but I still switch over to a desktop for GPU work because it's so much faster, for example.
Apple had that famously misleading chart where they showed their M1 GPU keeping pace with a flagship nVidia card that misled everyone at launch. In practice they’re not even close to flagship desktop accelerators, unfortunately.
They have excellent idle power consumption though. Great for a laptop.
by Aurornis
3/7/2026 at 11:40:20 PM
Then Windows 11 came along to slow everything down, and we are back to the stone age.
by s0rce
3/8/2026 at 12:11:33 AM
Recently got a new Surface laptop at work - Windows 11 gives me the same feeling I had from Vista. Hilarious how modern computers are more powerful than ever, but Windows 11 now feels worse than Windows 7 did ten years ago.
by ponector
3/7/2026 at 7:34:32 PM
My first Pentium was clocked at 60 MHz.