alt.hn

4/10/2026 at 12:03:34 PM

Intel 486 CPU announced April 10, 1989

https://dfarq.homeip.net/intel-486-cpu-announced-april-10-1989/

by jnord

4/10/2026 at 2:58:32 PM

My first computer was a 486SX at 25 MHz. [1] The rig (tower, monitor, etc.) cost around $3,000. We got the SX instead of the DX because it was $500 cheaper, and I wanted a 16-bit sound card. (Note that this is in 1992 dollars. Today it would cost over $7,000.)

My parents didn't have a lot of money, but my great-grandfather passed and they used some of the inheritance to buy the computer. I was instantly hooked. In hindsight I see how much of a gift my family gave me.

The announcement reminded me of an article John Dvorak [2] wrote around the same time. 1GB hard drives had just come out, and he asked what all the extra space would be used for. Even as a young teenager, I remember thinking how short-sighted that comment was. That was before I realized how the tech press tends to get stuck in local optimizations and can't understand the bigger picture.

It's all a good reminder that cutting edge today doesn't stay cutting edge very long, and the world figures out how to squeeze every ounce of power out of hardware. (Also, yes, that leads to bloat...)

[1] https://en.wikipedia.org/wiki/I486SX

[2] https://en.wikipedia.org/wiki/John_C._Dvorak

by wiremine

4/10/2026 at 3:21:08 PM

> In hindsight I see how much of a gift my family gave me.

True for many, many of us, I suspect. My family bought a 286 in the early 90s and it cost something like $2,000 CAD then, which is nearly $4,000 now. But salaries were lower then: it would have been something like 5-6% of my single-income family's post-tax earnings for the year, and if you think of it as a share of "disposable" income, it was probably more like 60% of it for the year.

Obviously it paid off in that it set me on the path for my career, hard to make any other investment as good as that, but who would have known that at the time? I'm glad that there were so many ads positioning computers as being educational and not just game machines; even though in reality I think it was learning about the computer to make the games work that taught me way more than any educational software ever did.

by mikestorrent

4/11/2026 at 12:41:47 AM

Ha! Same for me: a 286 in 9th grade (1990) for about $2k CAD. The 286 was a bad call though, as I think it was harder to expand compared with the 386. I remember 1MB RAM but really only 640k usable. Had to change some BIOS settings to get to ~700 kB?
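I can't vouch for the exact BIOS route, but on DOS machines of that era the usual trick for clawing back memory was loading DOS and drivers high via CONFIG.SYS; a sketch (paths and drivers varied by setup):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\MOUSE.SYS
```

Strictly speaking this frees up more of the 640k rather than raising the limit; getting past 640k of conventional memory usually meant remapping unused video regions, which only some BIOSes/chipsets supported.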

by amoorthy

4/10/2026 at 4:09:11 PM

Similar, but I got the 486 DX2-66.

I’ve been thinking a lot about these inflation-adjusted prices due to the big Apple Computer anniversary — an Apple // cost $5000 in 2026 dollars, meanwhile a $600 Macbook Neo cost $150 in 1980 cash!

What helped me reconcile this was an observation that we’ve inverted the prices of necessities and luxury goods. Rent and mortgage in particular were a much smaller slice of income back then, but luxury goods were very expensive, so one would save up for a year or two to buy a new TV or a computer for the kids.

Now the necessities take a much larger slice of our income, but TVs and computers are incredibly cheap. It takes very little money to get a nice computer, and not-buying it barely makes a dent in the bills. This isn’t a good thing.

I do disagree a little with your observation regarding the industry “squeezing every ounce of power out of hardware”. Beyond local LLM stuff, there’s basically nothing a modern computer can comfortably do that any laptop since the mainstreaming of SSDs can’t.

by Eric_WVGG

4/11/2026 at 1:15:21 AM

Audio, video, and 3D animation are still extremely processor intensive. You need something beefy if you're serious/professional about those.

Office tools and web browsing are less demanding.

by TheOtherHobbes

4/11/2026 at 2:03:15 AM

> You need something beefy if you're serious/professional about those

But you can get way better results with the lowest end computers than you could years ago. Back in the 90s my grandfather used 3DS Max to map out his future apartment's rooms and start planning furniture, using renders to get an idea of what the sunlight would look like at different times, etc. At the time, he did this on an expensive 486 that would take an entire day to render some of those visuals. Nowadays I can do the same with a free copy of Blender and any reasonably modern integrated GPU in probably under an hour.

by koutakun

4/11/2026 at 3:41:30 AM

"Nowadays I can do the same with a free copy of Blender and any reasonably modern integrated GPU in probably under an hour."

Try seconds or at most minutes.

by vardump

4/11/2026 at 7:41:35 AM

John Dvorak has tons of short-sighted articles.

I wanted to link his columns "Microsoft Dot Nyet" and "New Architecture Needed" from circa 2000-2001, but it turns out they have been memory-holed. They should be somewhere in the Wayback Machine.

EDIT: At least one of them has not been deleted; just his name has been removed:

https://www.pcmag.com/archive/new-architecture-needed-32570

by bananaflag

4/11/2026 at 8:47:11 AM

Yikes, you're not wrong. And I guess he's never heard of security issues, what with his ROM idea. Neat for a console (where the ROMs are game cartridges, as they used to be) or an appliance not connected to the internet, but not for a general-purpose OS...

Pretty much the only thing I agree with is that computer architecture could use a complete rework (both from a software as well as hardware side, though primarily the former); as well as said rework being basically impossible in practice.

by DarkUranium

4/10/2026 at 5:09:53 PM

> In hindsight I see how much of a gift my family gave me.

Gotta tack on to this thread showing appreciation for parents. We could never afford new computers in the 90s, but luckily my dad could bring home obsolete equipment from work. We were thus always at least a generation behind. I remember my friend's Pentium feeling like sci-fi compared to our 386, but my goodness it completely molded my life!

Later, towards the end of the 90s, those sci-fi Pentiums were obsolete, so I got a few to run "that weird Linux stuff" on. Since it was considered junk, nobody cared what I did with it. To this day, if I happen to hear Metallica play and there's early winter's first smell of snow in the air, my mind will be transported back to that school night I secretly stayed up wayyy too late and discovered SSH for the first time. Haven't looked back.

Thank you, dad! I just hope general computing devices owned by regular people are still natural by the time my children come of age.

by gspr

4/11/2026 at 1:53:51 AM

My grade school friend got a Nintendo and I wanted one so badly. My parents got me an Apple IIGS instead. I was a little disappointed about the Nintendo, but saw there were plenty of games on the thing, and of course it could do so much more than play games. That turned out to be a very good move on their part.

by wat10000

4/11/2026 at 1:54:28 AM

My mother was a stenographer. She used a 286 for processing docs. That baby wasssss alll mine during the day!!! All my friends had hacks for sys/bat/exe files to get wolfenstein at least to load. Best days of my life.

by ransom1538

4/11/2026 at 1:26:31 AM

> My parents didn't have a lot of money ...

Mine neither, although the grandparents were moderately wealthy. But my mom understood very early on that it was a match for me and that computers would really take off.

Fun story: the first BASIC I ever got was an Atari 2600 cartridge that came with some kind of "keyboard" in two parts you'd plug into the joystick ports. When my parents bought that Atari 2600 they tried it and spent the entire night playing "Tank Attack" on the TV in their bedroom. She only told me that years later.

Then, as I was writing tiny BASIC programs on the Atari 2600 gaming console, she realized I needed a "real" computer, so she bought me an Atari 600 XL a bit later. Then I began salivating over the neighbours' Commodore 64, which I could see through a window. And she thought: "If I buy the exact same computer as the neighbours, maybe my son and the neighbours shall become friends!". 42 years later, one of those neighbours just went to visit my brother in another country, and his brother and I exchange Telegram messages nearly daily.

Then the Amiga. Then the 386, 486, etc.

What a mom. RIP.

by TacticalCoder

4/11/2026 at 10:02:03 AM

Back in the day I couldn't even dream of a PC. They were way too expensive. It took my extended family chipping in (~15 people) to buy me a C64 with tape storage. Still, it was great fun. It made me learn programming in BASIC and English at the same time (as the Polish-language book included was so badly translated and full of errors that it was hopeless).

It was pre-internet obviously, so obtaining software was very difficult. For years while learning assembler I used a so-called "monitor cartridge" that did simple assembly/disassembly, but it didn't support labels and such. I would read about software like "Meta Assembler" that let you use labels and variables and think, "wow, I could do so much stuff with that..."

My first PC came sometime in the late 90s: a Celeron 233MHz with Windows 95. I wasn't a huge fan of Windows back then. I remember when one of the PC magazines I got had Red Hat Linux install CDs. I liked it from the start. The fact that my software-only modem and Lexmark printer didn't work got me into kernel programming :-)

Fun to think of it now, but I prefer 2026 a hundred times over :-)

by Roark66

4/10/2026 at 12:20:38 PM

The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.

The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top that long.

by fabiensanglard

4/10/2026 at 12:48:08 PM

The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.

Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.

Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.

by einr

4/10/2026 at 1:31:21 PM

I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were uncoupled (the /2 means clock doubling). For some applications, a 50MHz 486 with a 50MHz bus would beat a DX/2 66MHz with a 33MHz bus.

And sometimes the DX/4 100MHz would be the slowest of all those, at a 25MHz bus.

by bombcar

4/10/2026 at 1:46:19 PM

Nearly correct. The DX/4 100MHz had a 33MHz bus. The DX/4 75MHz had the 25MHz bus. I remember well because I had both.
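The relationship being argued over is just core clock = bus clock x internal multiplier; a quick sketch (chip figures as recalled in this thread, so treat them as approximate):

```python
# 486-era chips discussed above: (core MHz, internal multiplier).
# The front-side bus clock falls out as core / multiplier.
chips = {
    "486DX-50":   (50, 1),   # no multiplier: a fast 50 MHz bus
    "486DX2-66":  (66, 2),   # clock-doubled on a 33 MHz bus
    "486DX4-75":  (75, 3),   # "DX4" is really a clock tripler
    "486DX4-100": (100, 3),  # 33 MHz bus, not 25
    "Am5x86-133": (133, 4),  # clock-quadrupled, marketed as "5x86"
}

for name, (core, mult) in chips.items():
    print(f"{name}: {core} MHz core / x{mult} = {core / mult:.1f} MHz bus")
```

This also shows why the DX-50 could win some benchmarks: memory and I/O ran at the full 50 MHz, while the faster-core chips sat on a 25-33 MHz bus.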

by Sheeplator

4/10/2026 at 1:52:01 PM

Now I remember being annoyed that it wasn't the DX/3 as it should have been!

by bombcar

4/10/2026 at 1:59:26 PM

Especially since when actual clock-quadrupled chips eventually came out they had to call themselves ridiculous things like "5x86" instead of DX/4. (The Am5x86 133 runs at 4x33 MHz.)

by einr

4/10/2026 at 3:01:31 PM

I think 5x86 had more to do with marketing than anything else, because the Pentium had already been on the market for a while when the Am5x86 came out.

by cout

4/10/2026 at 3:16:29 PM

I think it's a bit of both. It absolutely tried very hard to pretend that it was a "586" (Pentium class), but "5x" is also right there and implies that if the DX4 is 4, this is 5.

The full name on the chip on some of them is "Am5x86-P75 DX5-133", which implies a lot of things, some of which are flat-out misleading (it does not get very close to "P75" performance).

by einr

4/11/2026 at 7:45:37 AM

I had one of these back in the day. A very fine 486.

by Zardoz84

4/10/2026 at 7:24:14 PM

I remember being so excited when I figured out how to jumper my DX/4 100 and operate it with clock doubling and a 50 MHz front side bus speed. Same core speed, faster memory and I/O.

My peripherals seemed to take it. My graphics output showed some slight glitches, which I was OK with for the speed.

However, I think it was a bit unstable and would fail a correctness challenge like compiling XFree86 or the Linux kernel, which were like overnight long runs. Must have been some bit flips in there occasionally. I seem to recall that once that reality settled into my brain, I went back to the clock tripler config.

by saltcured

4/10/2026 at 8:24:50 PM

I still remember scribbling on Athlons with a pencil to max them out - we probably spent as much on heatsinks as we saved on CPUs.

by bombcar

4/10/2026 at 12:54:16 PM

As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (each bit of a pixel index stored in a separate memory area) was a huge boon for 2D capability, since it lowered bandwidth needs to 6/8ths, but it made 3D rendering a major pain in the ass.

The Aga chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they had seen the need for 3D rendering is tantalizing.

1: https://news.ycombinator.com/item?id=47717334
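To make concrete why chunky modes mattered: with bitplanes, touching one pixel means touching every plane. A toy Python sketch of the planar-to-chunky (C2P) conversion renderers had to do (illustrative only; real Amiga C2P routines were hand-tuned 68k assembly):

```python
def planar_to_chunky(planes, width):
    """Convert bitplane graphics to one-byte-per-pixel (chunky).

    planes: one bytearray per bitplane; bit (7 - x % 8) of byte x // 8
    in plane p is bit p of pixel x's color index.
    """
    out = bytearray(width)
    for x in range(width):
        byte, bit = divmod(x, 8)
        index = 0
        for p, plane in enumerate(planes):
            if plane[byte] & (0x80 >> bit):
                index |= 1 << p
        out[x] = index
    return out

# Two bitplanes, eight pixels: pixel 0 has both plane bits set (color 3),
# pixel 1 only the plane-1 bit (color 2), the rest color 0.
pixels = planar_to_chunky([bytearray([0b10000000]), bytearray([0b11000000])], 8)
```

A chunky framebuffer makes the inner loop of a texture mapper a single byte write per pixel; on bitplanes it is this bit-shuffling per pixel, which is why Doom-style renderers favored the PC.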

by whizzter

4/10/2026 at 1:12:18 PM

> The Aga chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU-based chunky-to-planar algorithms. I don't think it was ever even used in any serious way by any released games (though it might have been used to help with FMV).

by feintruled

4/10/2026 at 2:14:30 PM

Ah, I was under the impression that it had a native chunky mode, but it was a built-in C2P routine? Anyhow, it seems it was useful (1) when running on stock CD32s, but not in conjunction with faster machines.

1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...

by whizzter

4/10/2026 at 2:34:43 PM

Which brings me to my pet peeve: the already slow 68020 (68EC020) at 14MHz was crippled because, even though it had a 32-bit bus, it was only connected to a 16-bit RAM bus (chip RAM).

This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and the graphics and audio processing.

All in all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs. about 5 megabytes/s).

If the A1200 had come with at least some extra 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested memory with a throughput of about 20-40 megabytes/s.

Imagine the difference it would have made if the machine had just a little extra memory.

That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.

The bigger problem was that Commodore as a company was aimless.

by actionfromafar

4/10/2026 at 2:49:18 PM

Yeah, and it took ~7 years to make those marginal improvements over the earlier Amiga chipset! I'm ignoring ECS, since it barely added anything over OCS for the average user.

by icedchai

4/10/2026 at 12:57:03 PM

> The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…

by einr

4/10/2026 at 2:04:39 PM

Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade and re-invent x86, which until x64 still retained so much of the x86 instruction encoding/heritage (heck, even x64 retains some of the instruction encoding characteristics).

Had the Amiga retained relevance for longer, and without the push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.

by whizzter

4/10/2026 at 2:39:14 PM

The 68060 is pretty good to be fair, but it never ended up being widely used and Motorola definitely saw PPC as the future.

Maybe if these theoretical new 68k Amigas had become a huge market hit they could have taken the arch further and it could have remained competitive, but all the other 68k shops had already pretty much given up or moved on (Apple was already going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc.), so I don't know that the market would have been there to support development against the vast amount of competition from both the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.

by einr

4/10/2026 at 2:22:35 PM

The argument is that 68k is "CISCier" than x86, the addressing modes in particular, so making a performant modern out-of-order superscalar core that uses it would be harder than for x86.

by fredoralive

4/10/2026 at 2:44:15 PM

I believe that. But Commodore could have plunked a cheap 68020 in their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering, and non-linear video editing. It would have been very cool to have an architecture where the UI could be rendered with almost hard-realtime guarantees and the heavy processing happened elsewhere.

by actionfromafar

4/10/2026 at 2:51:16 PM

This is almost exactly what the plan was, until C= went out of business:

https://en.wikipedia.org/wiki/Amiga_Hombre_chipset

It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.

by einr

4/10/2026 at 7:02:24 PM

How much of Hombre is myth and legend? Given how little progress was made with OCS->ECS->AGA, it seems unlikely they could even have built an Amiga SoC, never mind designed a new 64-bit chipset.

by icedchai

4/10/2026 at 1:09:55 PM

There were no tech problems IMHO; it was all mgmt problems. They could have chosen any of a handful of completely different (edit: mutually exclusive, even!) tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.

Edit: I don't mean that their success was certain if they had executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very few resources to work with.)

by actionfromafar

4/10/2026 at 1:31:49 PM

Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"

by TMWNN

4/10/2026 at 1:51:07 PM

IIRC, interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidths available at the time.

Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.

By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.

by whizzter

4/11/2026 at 6:06:30 AM

5 sins in 1992:

- 8-bit planar instead of chunky
- no progressive display (interlaced only)
- sound was not 16-bit
- should have been a 68030 with MMU support (vs the 68EC020)
- HD mandatory

If they had addressed these, the Doom experience would have run better on the Amiga.

by smallstepforman

4/10/2026 at 12:52:55 PM

At that point in time I would not have called it Wintel yet. That started after Windows 95, IIRC.

by darkwater

4/10/2026 at 1:03:24 PM

Yep. 486DX/2 was when I started seriously looking at moving on from the Amiga. I wound up with a DX/4 100 sometime in 1994.

by icedchai

4/10/2026 at 12:55:30 PM

My classmate kept his Amiga 1200 a bit longer! ...eventually he got a PC with Pentium 60 MHz.

by gattr

4/10/2026 at 12:58:47 PM

Yeah, there were holdouts of course but the DX/2 really seems like the breaking point.

(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)

by einr

4/10/2026 at 1:43:04 PM

The Pentium is a bad processor? It's way faster than the 486; especially on FP it's not even close.

by Synaesthesia

4/10/2026 at 2:01:53 PM

The original Pentiums (Socket 4, 60 or 66 MHz) had the infamous floating-point division bug, had underwhelming perf for anything not FP-bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always a more rational choice.

Second gen Pentiums, starting with the 75 MHz, were great.

by einr

4/10/2026 at 3:26:37 PM

I had a P60 that had the F00F bug; Windows would crash for weird reasons on it, but Linux ran like a champ because it actually had a workaround. Luckily my chip had already been recalled for the FDIV bug, so it wasn't a total boat anchor. Loved that machine. I had BeOS, QNX, and one time I made Linux look like Solaris with all the Open Look stuff; really enjoyed that aesthetic.

Now we have these amazing displays and graphics cards, and there's literally no way to make my Mac have different window titlebars or anything. So boring.

by mikestorrent

4/11/2026 at 7:52:56 AM

Did you try Linux again recently?

by Zardoz84

4/10/2026 at 3:12:55 PM

Idk if the 75 was really that great though, mostly in that it had a 50MHz FSB rather than 60 or 66MHz like most other parts.

Another factor for the later P1s being better, IIRC, was improved chipsets.

by whaleofatw2022

4/10/2026 at 3:48:34 PM

We had a 90 overclocked to 100MHz that served as the family computer. I inherited it when the family computer was upgraded to a K6-2, and it chugged along as my personal computer until ~2001 thanks to Linux, while the GHz barrier had already been broken for a while in the Intel world.

I think my next computer came with an AMD Duron 900MHz, entry-level at the time, but the jump from the Pentium 100MHz was such a huge gap it still felt like a Formula 1.

by prmoustache

4/10/2026 at 3:21:32 PM

To be more exact, I think the first great Pentium was the 133, but the 75 was the first that was a real, proper jump in performance from a fast 486 and represented decent price/performance.

by einr

4/10/2026 at 3:09:42 PM

It didn't help that the earliest P5 Pentiums ran on a 5V rail. Newer revisions, starting with the P54 core, used 3.3V, which helped keep the chips cool.

by TheAmazingRace

4/10/2026 at 1:47:59 PM

The Pentium was great, but the 60 and 66MHz versions were not liked; they ran way too hot.

by Sheeplator

4/10/2026 at 6:38:42 PM

I think, given the price, people also expected a performance boost similar to going from the 386 to the 486. What also made the Pentium confusing is that Intel introduced PCI during this time.

Going from a 486 with VLB to a Pentium with PCI, everything became a lot nicer.

by phicoh

4/10/2026 at 2:04:43 PM

Many tasks perhaps, but running Quake was not one of them.

by Sharlin

4/10/2026 at 2:10:18 PM

Yeah, it does alright and is a significant difference from a DX/2, but Quake came out in '96 and the P60 came out as a super expensive workstation-class CPU in '93. If you were a gamer in '96 it is unlikely you were rocking a P60, because it was never good value for money.

by einr

4/10/2026 at 10:01:31 PM

You could play 320x200 Quake acceptably on a P60. On a DX4 too, though barely - my family had both in the mid 90s. I'd be surprised if Quake is playable on a DX2.

by ngcazz

4/10/2026 at 2:37:02 PM

I distinctly remember having a Strike Commander poster in my bedroom saying “Strike really flies on a 486 DX/2”. Fond memories indeed.

by alex_suzuki

4/10/2026 at 12:35:11 PM

Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.

by throwaway_20357

4/10/2026 at 3:20:20 PM

My boss then - who's still a very dear friend - purchased a work computer to play Doom. He was already mentally checked out of that job and was looking for his next opportunity. Spent a lot of time at work playing Doom and got quite good at it.

I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.

by intrasight

4/10/2026 at 1:01:29 PM

Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.

by simmons

4/11/2026 at 2:23:17 AM

My first Intel-based PC was actually a 486DX/2-66 "Houdini" card for my PowerMac 6100/60 in late 1994. It had an SB16 daughtercard and could either share RAM with the host Mac or use a 32MB dedicated SIMM. I added a dedicated SIMM when its price dropped to $300.

by raw_anon_1111

4/10/2026 at 2:16:49 PM

I wonder, I wonder where one could find a good book about the software architecture of that game… oh, well

by busfahrer

4/10/2026 at 3:37:19 PM

...and with 8 MB (-eight- for the youngsters ;-) of RAM you were absolutely the king :-D

by KellyCriterion

4/10/2026 at 1:53:08 PM

The 486 and https://www.delorie.com/djgpp/history.html changed everything.

Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.

by theodorethomas

4/10/2026 at 3:02:36 PM

It's hard to convey to today's generation, who think Ivy Bridge to Haswell was a big jump or whatever, how awesome the 286 -> 386 -> 486 transitions were for personal computing. It felt almost like going from an NES to a Super Nintendo to an N64. The improvements were astounding.

by ryandrake

4/10/2026 at 3:54:20 PM

It wasn't a big jump, but it was a jump. Ivy Bridge lacks the instruction set extensions required to run RHEL 10 [1]: the minimum supported microarchitecture level is x86-64-v3, and Ivy Bridge lacks the AVX2 instructions.

[1]: https://docs.redhat.com/en/documentation/red_hat_enterprise_...
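For context, x86-64-v3 is a bundle of required CPU features rather than a single instruction. A rough sketch of the check (the feature list follows the x86-64 psABI level definitions; flag spellings here are illustrative, e.g. Linux reports LZCNT under "abm"):

```python
# Sketch: does a CPU's feature-flag set satisfy x86-64-v3?
# Feature names are illustrative approximations of /proc/cpuinfo flags.
X86_64_V3 = {"avx", "avx2", "bmi1", "bmi2", "f16c", "fma", "abm", "movbe", "xsave"}

def meets_v3(cpu_flags):
    """True if every v3-required feature is present in the flag set."""
    return X86_64_V3 <= set(cpu_flags)

# Ivy Bridge has AVX and F16C but no AVX2/BMI/FMA; Haswell added those.
ivy_bridge = {"sse4_2", "avx", "f16c", "xsave"}
haswell = ivy_bridge | {"avx2", "bmi1", "bmi2", "fma", "abm", "movbe"}
```

So a distro built for the v3 baseline refuses Ivy Bridge not because of one missing opcode but because the whole Haswell-era feature set is assumed at compile time.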

by ciupicri

4/10/2026 at 3:56:37 PM

I'm surprised RHEL is requiring AVX2; they usually had some slack in their processor requirements (though I'm sure not as much slack as Debian).

by raverbashing

4/10/2026 at 2:26:53 PM

I remember trying to run a game, Rise of the Triad, which was built on an improved Wolfenstein engine IIRC, and having it struggle on my 386 unless I made the viewport as small as possible. At which point it told me to buy a 486... well, I did eventually, so I guess it worked.

by adzm

4/10/2026 at 2:54:10 PM

Had the same experience with Doom II. Got it to run surprisingly well on a brand new Tandy 486DX2 + 4MB RAM, though I seem to recall having issues with SoundBlaster compatibility.

by temporallobe

4/10/2026 at 2:06:30 PM

Amazing to see a webpage "Updated Dec 1998" still up, running and displaying correctly.

by dbdr

4/10/2026 at 2:27:13 PM

Without fancy JS or CSS, sites can last decades easily

by madduci

4/10/2026 at 3:07:40 PM

With JS and CSS sites can last decades easily.

by spankalee

4/10/2026 at 3:22:25 PM

Agreed, it's not those; it's the fact that we went from JS being a little sprinkling of dynamism on a document to an entire build process with massive numbers of dependencies and browser shims. The web feels like a mistake as a platform...

by mikestorrent

4/11/2026 at 4:42:24 AM

I said "fancy", meaning frameworks or custom things. With vanilla JS everything is durable.

by madduci


4/10/2026 at 2:55:18 PM

It was really the 386 that was the beginning of modern computing, since it had an MMU.

by CyberDildonics

4/10/2026 at 4:19:45 PM

Several operating systems on the 286 (e.g. Xenix, Coherent, OS/2) used its MMU for multitasking and memory protection. See https://en.wikipedia.org/wiki/Intel_80286#Protected_mode

by fulafel

4/10/2026 at 5:44:31 PM

The 286 protected mode did not allow for a 32-bit flat address space and was half-baked in other ways too, e.g. there was no built-in way to return the CPU to real mode without a slow and fiddly CPU reset.

by zozbot234

4/10/2026 at 7:18:44 PM

It was architecturally a 16-bit CPU, so a flat 32-bit address space would be a non sequitur. If you wanted flat 32-bit addressing, there was a contemporary chip that could do it with virtual memory: the Motorola 68010 plus the optional external MMU. (Or, if you were willing to jump through some hoops, even a 68000... see the Sun-1.)

by fulafel

4/10/2026 at 6:59:04 PM

Coherent was the first Unix-like OS I ran, on a 386SX box. I think it was Coherent 4.x.

by icedchai


4/10/2026 at 3:19:07 PM

Except the 486 had hardware floating point, essential for technical work.

by theodorethomas

4/10/2026 at 4:25:16 PM

By the way, "the i486SX was a microprocessor originally released by Intel in 1991. It was a modified Intel i486DX microprocessor with its floating-point unit (FPU) disabled." (https://en.wikipedia.org/wiki/I486SX)

by ciupicri

4/10/2026 at 3:59:07 PM

An MMU is pretty much necessary for robust multitasking. Without it, you are at the whim of how well software behaves, and it is more difficult for developers to create well-behaved software in the first place. That also assumes good intentions from programmers, since an MMU is necessary for memory protection (and thus security).

While emulating an FPU carries a huge performance penalty, an FPU is only required in certain domains, and in the world of IBM PCs it was possible to upgrade your system with an FPU after the fact. I don't recall seeing an MMU upgrade option for IBM compatibles; while I have seen socketed MMUs on other systems, I don't know whether they were intended as upgrade options.

by II2II

4/10/2026 at 5:12:08 PM

You could buy an 8087 for your 8086 or 8088; the 486DX just moved it on-chip.

by saati

4/10/2026 at 3:24:13 PM

That's an advancement, but it's a matter of speed and simplicity. An MMU is a huge before-and-after; it's still the biggest separator of CPUs today. The most important detail for understanding a CPU is whether it has an MMU.
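A toy sketch of what that separation buys (concept only, nothing like the real 386 two-level page tables): every memory access goes through a per-process page table, so a stray pointer faults instead of trampling another process:

```python
# Toy model of MMU address translation: each process has its own page
# table mapping virtual page numbers to physical frame numbers.
PAGE_SIZE = 4096

def translate(page_table, vaddr):
    """Resolve a virtual address, faulting on unmapped pages."""
    vpage, offset = divmod(vaddr, PAGE_SIZE)
    if vpage not in page_table:
        # On real hardware this traps to the OS as a page fault.
        raise MemoryError(f"page fault at {vaddr:#x}")
    return page_table[vpage] * PAGE_SIZE + offset

# Disjoint mappings mean process A physically cannot read or corrupt B.
proc_a = {0: 7}  # A's virtual page 0 -> physical frame 7
proc_b = {0: 9}  # B's virtual page 0 -> physical frame 9
```

Without an MMU, both "processes" would just be dereferencing the same physical addresses, which is why MMU-less multitasking relied on every program behaving itself.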

by CyberDildonics

4/10/2026 at 12:40:06 PM

We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. A great way to multitask several DOS applications!

by loloquwowndueo

4/10/2026 at 4:23:49 PM

The 486 was my dream. Unfortunately, my parents didn't have money for it. I bought my first PC in 1999: a Pentium II. I invested a lot of money in the monitor, since computers become obsolete very quickly while a monitor can serve for many years. Surprisingly, flat monitors appeared soon after...

by alex_be

4/11/2026 at 2:24:04 AM

Yeah, but the first LCD screens sucked: poor color rendition and not usable for gaming. In the early 2000s you were better off sticking with your CRT.

by snek_case

4/10/2026 at 1:55:48 PM

I didn't have access to a 486 until around 1999. I was making do with a hand-me-down 8088 and then a 386SX.

Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.

by realreality

4/10/2026 at 10:07:19 PM

Friend of mine is still rocking a 1st gen retina MacBook Pro (from 2012) for music production!

by ngcazz

4/10/2026 at 2:50:52 PM

Funny, I'm working with Intel 686 right now. It's brutal to get stuff to build, e.g. anything rust/cargo related (missing deps, but mostly the hardware being slow). Recently I've been trying to fix a maturin problem I ran into. But the backwards compatibility of Python 3.11 on 32-bit Debian 12 is cool.

The CPU I'm working with is a Celeron M at 900 MHz, single core, no HT, struggling to build wheels for Python (several hours).

by ge96

4/11/2026 at 3:11:58 AM

It's great Python is/was well supported on i686. Node on the other hand almost immediately started requiring SSE2 even in the earliest versions. Have not found success with Node + Pentium III yet, maybe need to build an earlier version myself.

by accrual

4/11/2026 at 5:53:43 AM

I got it to work on an Intel 270, but that has HT and 1.6 GHz; still slow (hours to build wheels), specifically temporalio and cryptography.

Yeah, Node is usually my go-to, love JS.

by ge96

4/10/2026 at 12:34:12 PM

I remember getting my first 486 33 MHz computer and being able to play Ultima 7: The Black Gate, and later Ultima 7 Part 2. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!

by roody15

4/10/2026 at 12:48:18 PM

I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc.).

Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing textured rendering is a PITA due to the video memory layout with separate bitplanes spreading the bits of each pixel into different memory locations (the total memory-bandwidth reduction that made sense in 1985 became, with 5 or 6 bitplanes, a fatal bottleneck at this point).
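To illustrate the point (a toy model in Python, not actual Amiga code): in a chunky/flat framebuffer a pixel is one store, while with 5 bitplanes every pixel write becomes 5 read-modify-writes scattered across 5 separate memory regions.

```python
DEPTH = 5           # 5 bitplanes -> 32 colors, as on the Amiga OCS
WIDTH = 320         # pixels per scanline

# Chunky/flat framebuffer: one byte per pixel, one store per write.
chunky = bytearray(WIDTH)

# Planar: one bit per pixel per plane, pixels packed 8 to a byte.
planes = [bytearray(WIDTH // 8) for _ in range(DEPTH)]

def planar_put_pixel(x, color):
    """Five read-modify-writes, each touching a different memory region."""
    byte, mask = x // 8, 0x80 >> (x % 8)
    for bit, plane in enumerate(planes):
        if (color >> bit) & 1:
            plane[byte] |= mask
        else:
            plane[byte] &= ~mask & 0xFF

def planar_get_pixel(x):
    byte, mask = x // 8, 0x80 >> (x % 8)
    return sum(bool(plane[byte] & mask) << bit
               for bit, plane in enumerate(planes))

chunky[17] = 13           # chunky: a single store
planar_put_pixel(17, 13)  # planar: five scattered read-modify-writes
assert planar_get_pixel(17) == chunky[17] == 13
```

This is why texture mapping hurt so much on the Amiga: each arbitrary-color pixel costs multiple scattered accesses, whereas a Doom-style renderer on VGA mode 13h just writes consecutive bytes.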

It wasn't always full framerate though, and the 2D chipsets did still help in the "classic" action games that were still all the rage.

The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could do graphics better in some respects (as shown with Quake).

Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from when it comes to price (and comfort/ease).

by whizzter

4/10/2026 at 3:00:29 PM

Linux kernel version 7.1 will drop support for 486: "Linux devs think even one second spent on 486 support is a second too many." https://arstechnica.com/gadgets/2026/04/linux-kernel-maintai...

by gardaani

4/10/2026 at 3:19:26 PM

>This chip was originally introduced in 1989, was replaced by the first Intel Pentium in 1993, and was fully discontinued in 2007

That's really long compared to the 1-year refresh cycles we have today with phones etc.

by mayama

4/10/2026 at 6:45:17 PM

I can understand running an old 486 machine for nostalgia reasons, or because you have some old industrial equipment that relies on it and even one second spent replacing it is a second too many, but I struggle to imagine why you'd want or need to run a modern Linux kernel on it.

by wk_end

4/10/2026 at 2:57:50 PM

• Ran my first Linux at home on a i486-DX2 (33 MHz, 4 MB RAM), which supported a decent X11/R6 performance in color in 1992, with a 14" CRT.

• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.

• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)

by jll29

4/11/2026 at 8:39:16 AM

I remember compiling the Linux kernel on SuSE 6.3 on an AMD 486DX5 at 133 MHz ... good times, and I don't forget to do "make mrproper"

by Zardoz84

4/10/2026 at 3:38:39 PM

> and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August

Chromium browsers launch pretty fast. If you're talking about memory usage, Ladybird isn't aimed at minimal memory use from what I've seen.

by mayama

4/10/2026 at 6:20:50 PM

Wouldn't the DX2 be 66 MHz? Or did you intentionally run it at 33 MHz?

by TheAmazingRace

4/10/2026 at 2:04:18 PM

Hard to imagine now, but this was a huge turning point. A genuinely powerful CPU in a "Pee-Cee" available for less than RISC workstation money. I had to wait a while; mine was an AMD DX2-66 since I didn't have a budget for Intel... add Slackware... and countless hours messing with XF86Config, and I had a poor man's Sun workstation.

by nickdothutton

4/10/2026 at 3:40:34 PM

Raise your hand if you have been there and:

- tinkered for HOURS to get enough EMS/XMS memory by tweaking CONFIG.SYS & co. to get whatever game running (and had dedicated boot options configured, because you could leave some drivers out of memory and then run other games)
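Been there. For anyone who missed the era, a multi-config CONFIG.SYS looked something like this (a sketch from memory using MS-DOS 6 menu blocks; exact driver names and paths varied per machine):

```
[MENU]
MENUITEM=GAMES, Boot for games (max conventional memory)
MENUITEM=NORMAL, Boot with CD-ROM driver loaded

[GAMES]
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH,UMB
FILES=30

[NORMAL]
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001
DOS=HIGH,UMB
FILES=30
```

Every kilobyte of free conventional memory below 640K counted; plenty of games simply refused to start without enough of it.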

:-D

by KellyCriterion

4/10/2026 at 3:43:38 PM

I still have a 486 Linux system from those days - it has not been turned on in this century, but I'll try it some day, together with a glass of whisky :-)

by axeldunkel

4/10/2026 at 5:28:39 PM

And of course, support for this venerable processor will be dropped in Linux kernel 7.1 in a couple of months time.

by mrlonglong

4/10/2026 at 2:42:25 PM

Microsoft was and still is the reason why average people needed more powerful chips lol, maybe with the exception of browser bloat.

by Aperocky

4/10/2026 at 1:32:01 PM

I've got one sitting on the shelf above my desk, a 33 MHz DX; I don't even remember what machine it came out of.

by markbnj

4/10/2026 at 2:33:27 PM

I've got an AMD-branded 286 chip, from my first owned-by-me PC, Blu-Tacked to the case of my home desktop PC, powered by a Ryzen something-or-other from a few years ago (with a 1060/6GB card from a few years before that, because I wasn't gaming enough to justify a new graphics card along with the other updates at the time).

by dspillett

4/10/2026 at 2:08:04 PM

I too have one sitting on my desk, a 486DX2 66 MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was for playing Doom and Descent than the 33 MHz, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.

by JCSlim

4/10/2026 at 2:57:22 PM

I got a paper route just to get a hold of the dx2.

It was a life-changing machine.

Ordered, I believe, from the depths of a Computer Shopper magazine.

by randomdrake

4/10/2026 at 2:27:57 PM

I loved my 486DX2 66 MHz-based IBM PS/1 (2168), which had a whopping 8 MB of RAM. Not only did it really let me experience the fullness of PC gaming of the era, but it was the first computer I was able to install an internal modem into, and the computer I used to get SLIP dial-in access to the state university mainframe and thus to the Internet (prior to that I was limited to the Prodigy walled garden). It was this computer that let me play early MUDs via telnet, let me play my first graphical MMORPG (Ultima Online), and introduced me to real visual programming (Visual Basic).

To a significant degree, the 486DX2 was the primary computing platform that created the foundation I needed to learn computing in depth, enabled my later career, and set many of the formative moments in my life. Thanks Intel; even though you're now a shadow of your former self, you were a beast in the 90s.

by tristor

4/10/2026 at 12:57:47 PM

How was the person incorrect that speed increases won't continue forever? The Pentium 4 hit 3.8 GHz, and a Ryzen 7 runs at 4.7 GHz some 20-odd years later.

by welfare

4/10/2026 at 2:32:43 PM

While the speed increases weren't as dramatic, note that even in single-core speed, unlike what the clocks would suggest, the Ryzen 7 is much, much more than 1.23x faster than the P4. The P4 was a particularly fragile architecture, and its achieved IPC on real code was typically well below 1, often closer to 0.5. The X3D variants of Ryzen have been measured running above 3 average IPC on real, complex loads. So the single-core uplift from that P4 to a modern AMD core is about the same as from a 300 MHz Pentium to the 3.8 GHz P4; it just took 20 years, not 8. Of course, now we also have 8 times the cores.
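Putting rough numbers on that (the ~0.5 and ~3 IPC figures are the estimates from the comment above, not measurements):

```python
# Effective single-core throughput ~ clock rate * instructions-per-clock
p4_hz, p4_ipc = 3.8e9, 0.5      # Pentium 4, fragile pipeline
zen_hz, zen_ipc = 4.7e9, 3.0    # Ryzen X3D-class core (estimate)

clock_ratio = zen_hz / p4_hz
real_ratio = (zen_hz * zen_ipc) / (p4_hz * p4_ipc)

assert round(clock_ratio, 2) == 1.24  # what clocks alone would suggest
assert round(real_ratio, 1) == 7.4    # estimated actual per-core uplift
```

So roughly a 7x-8x per-core gap hiding behind a 1.24x clock difference, before even counting the extra cores.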

by Tuna-Fish

4/10/2026 at 3:36:58 PM

> How was the person incorrect that speed increases won't continue forever?

Through the magic of saying something different in actuality, which really ended up being proven incorrect. From the blogpost above, verbatim, italicizing the relevant bits:

> Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.

by perching_aix

4/10/2026 at 1:26:52 PM

Clock speeds used to go up in a straight line (the normal "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8 GHz, we leveled off for decades.

by bombcar

4/10/2026 at 2:09:24 PM

(To make it clear, straight line on a log scale. Exponential on a linear scale.)

by Sharlin

4/10/2026 at 2:08:40 PM

A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s speeds, which have improved much more, it's debatable).

by Sharlin

4/10/2026 at 5:05:53 PM

> But when Word 97 arrived with real-time spelling and grammar checking and Clippy, the 486 couldn’t keep up. You really needed a Pentium or equivalent to do all three at once without noticeable lag as you typed.

In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag -- and WordStar fit on one side of a 5.25" floppy.

by insane_dreamer

4/10/2026 at 5:27:13 PM

WordStar originally didn't have a spell checker; it was an add-on product. And even after SpellStar was integrated (a response to the NewStar clone's built-in spell checker), it was never as-you-type spell checking, which is what we got in Word 97, and what consumed the cycles on a 486.

Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-on, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real-time.

Yes, programs have become bloated, but it is worth it to compare apples to apples.

One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.

by compiler-guy

4/10/2026 at 9:34:11 PM

Fair points. I'd argue that realtime spellcheck doesn't provide a lot of value -- when you're writing you want to focus on the writing and go back and fix the spelling when you do the editing.

I'd argue it was a combination of "now we have more processing power, let's see how we can use it up" and "we don't have to make so many hard design and programming decisions thanks to the extra power", with the result being that you "had" to get the new chips to run the new software that was replacing the old software.

Repeat that a number of cycles and we wound up with Windows Vista ;)

Since we're discussing word processors, I would say that WordPerfect 5 for DOS was the best word processor I've used to date (Pages on Mac comes in second). It did almost everything that Word does today in terms of word processing (not page layout, but Word is terrible at that anyway; you really need InDesign to do that properly), was fast and easy to work with (keyboard shortcuts are much faster than a mouse/GUI), and didn't require nearly as much processing power.

by insane_dreamer

4/10/2026 at 6:23:56 PM

Apples to apples? More like Windows to Windows. LOL

by TheAmazingRace

4/10/2026 at 2:12:48 PM

For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.

Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.

sigh

by titzer

4/11/2026 at 5:29:32 AM

dx2 gang

by bpbp-mango

4/10/2026 at 12:46:37 PM

486SX 33 MHz, could not afford the DX

by christkv

4/10/2026 at 1:08:49 PM

My experience too, as I dimly remember it.

by bluebarbet

4/10/2026 at 1:28:00 PM

The 486SX was a fine chip, just no math coprocessor.

The 386SX was crap: a 16-bit-wide bus, IIRC.

by bombcar

4/10/2026 at 3:08:30 PM

For a time, systems with a 386SX were significantly cheaper than those with a 386DX, because the 16-bit data bus meant cheaper motherboards could be used.

If you were running 16-bit software, they were little slower than a 386DX at the same clock, and significantly faster than a 286, because of higher clocks (286s usually topped out at 12 MHz, though there were some 16 MHz options; the slowest 386s ran at 16 MHz, with some as fast as 40 MHz), but also in part, when not blocked by instruction-ordering issues, due to the (albeit small by modern standards) instruction pipeline, which the 286 lacked.

32-bit software was a lot slower than on a DX, because 32-bit data reads and writes took two trips over the 16-bit data bus, but you could at least run the code, as it was a full 386 core otherwise (full enhanced protected mode, page-based virtual memory, v8086 mode, etc.).

The SX also only used 24 bits of the address bus, limiting it to 16 MB of RAM compared to the original's 4 GB range, though this was not a big issue for most at the time.
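The bus-width penalty is simple to put in numbers (a sketch; real timings also depended on wait states and memory speed):

```python
def bus_transfers(access_bits, bus_bits):
    # ceil(access_bits / bus_bits): trips over the data bus per access
    return -(-access_bits // bus_bits)

# 16-bit software: the SX pays no penalty vs a DX at the same clock
assert bus_transfers(16, 16) == bus_transfers(16, 32) == 1
# 32-bit software: every dword load/store takes two bus cycles on the SX
assert bus_transfers(32, 16) == 2
assert bus_transfers(32, 32) == 1
```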

by dspillett

4/10/2026 at 1:37:40 PM

Ahhh, but it gave me the opportunity to run real programs, coming from an XT! *Edited to add an example: I could use AutoCAD for the first time. The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.

by mauriciolange

4/10/2026 at 1:55:44 PM

Yeah, by the time we were getting into it the 486 was already out, but we wanted the real 32-bit bus and had to be a bit careful when looking at used computers (as by that time the 386SX and DX machines were about the same price).

by bombcar

4/10/2026 at 2:46:53 PM

I can't remember, could you buy a math coprocessor for it?

I know you could pair my 286 with a 287 next to it... not sure it made a discernible difference outside of hyper-specific uses though.

by bbarn

4/10/2026 at 3:19:25 PM

There were 387 co-pros, just like the 287s (and 8087s). You could actually use a 287 to provide floating-point instructions to a 386, albeit more slowly than a 387.

Very little, if any, “home” or small-business software would make use of a floating-point unit though (maybe some spreadsheet apps did?). The most common use for them was CAD/CAM, and those doing scientific modelling without a budget that would allow for less consumer-grade kit.

by dspillett

4/10/2026 at 3:14:41 PM

I believe so: the 487, which had a full 486 on board and disabled your main CPU.

by hypercube33

4/10/2026 at 2:01:51 PM

Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486DX4-100 was the last Intel I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).

by marald

4/10/2026 at 7:25:04 PM

Heh, I remember using my first machine, a 486, for a long time after it was obsolete, and reading system requirements like: what do you mean "Pentium recommended", and why the hell do you need 16 MB of RAM? It's interesting to reflect that old games like Settlers, HoMM 2, or Warcraft 2, which are no worse than modern ones gameplay-wise, used to run on something so vastly underpowered by modern standards that the numbers don't even feel like a real spec.

by sershe

4/11/2026 at 8:53:44 AM

don't forget the original Command & Conquer

by Zardoz84

4/10/2026 at 2:47:27 PM

Hard to convey these days how the 486 felt like an absolute quantum leap in computing power.

I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.

by andrewstuart

4/10/2026 at 1:14:47 PM

Uuh! I recall I had this setup, not in '89, but sometime in the early 90s.

Played some awesome games, like DOOM and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran it on the same setup or something newer.

by phplovesong

4/10/2026 at 12:46:37 PM

It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things; we've reached the end."

The lack of imagination is just disturbing.

by skerit

4/10/2026 at 1:00:06 PM

On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another rube goldberg machine to handle that!").

by ahartmetz

4/10/2026 at 1:25:43 PM

The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).

In the 2000s through now, we've mostly had improvements: 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing and has somewhat reawakened a similar spirit to the 80s/90s, but not with the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you finding it can run Doom or even play music.

by bombcar

4/10/2026 at 1:09:53 PM

It's easy to mock in hindsight, but the failure mode isn't lack of imagination. It's extrapolating linearly from physical limits that were real at the time. In 1989, DRAM refresh cycles and bus bandwidth genuinely were bottlenecks that seemed fundamental. What nobody predicted was that the industry would sidestep those walls entirely (caches, pipelines, out-of-order execution, then multicore). Architectural innovation tends to appear orthogonally to wherever the current wall is.

by mc-serious

4/10/2026 at 3:42:15 PM

That's not so different from today, wherein:

All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.

The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.

And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?

by ssl-3

4/10/2026 at 1:26:43 PM

The first 80286-based system (IBM PC AT), 80386 (Compaq Deskpro 386), and 80486 all had people writing about their suitability as servers, with the implied consensus being that normal people didn't need them.

The Pentium is the first one, I think, where this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing, but faster, much more often than they need servers.

by TMWNN

4/11/2026 at 1:06:12 AM

Um. That never happened. No-one ever felt that. Not a soul.

Everyone - everyone knew it was the start of a revolution.

by wewewedxfgdf