alt.hn

4/7/2026 at 8:38:17 AM

Every GPU That Mattered

https://sheets.works/data-viz/every-gpu

by jonbaer

4/7/2026 at 5:21:57 PM

I don't believe this list was curated as the title suggests. It's just a semi-random list of popular-ish GPUs with LLM-generated descriptions.

The site looks nice, which fools us into thinking thought and effort were put into it.

by dantillberg

4/7/2026 at 7:38:06 PM

It does start with the Voodoo, so I got what I came for.

by specproc

4/8/2026 at 4:49:03 AM

They do, which is fine.

Where's my Matrox?

The Matrox Mystique was a combination 2D/3D consumer card, which, at the time, was still something that mattered. Sure, a Voodoo add-on card mattered more very soon, but then things quickly shifted back to combination 2D/3D cards with Nvidia!

Also, how is a "first $2000 consumer card" something that "matters"? That's precisely the kind of thing that doesn't matter. My entire laptop cost less than that and I play games with it. What matters much more is that I can play quite a bunch of games, even pretty recent ones, on a laptop that cost less than that, all with an integrated graphics chip from a company that is precisely known for having abysmal 3D performance: Intel (I have an Iris Xe).

by tharkun__

4/8/2026 at 8:13:46 PM

Indeed. It was, sadly, more important than popular. Which, to my eyes, supports the generated-content theory.

by ldng

4/8/2026 at 12:48:20 PM

Should have started with ANTIC - the very first programmable video chip I'm aware of...

by spacedcowboy

4/7/2026 at 8:51:18 PM

It didn't start with the Verite, so I didn't get what I came for. :(

by ChrisClark

4/7/2026 at 10:41:43 PM

They had great box art

by pcdevils

4/8/2026 at 9:17:37 AM

Considering they list multiple versions of the same generation of card, like the GTX 1080 and GTX 1080 Ti, this is just a list of popular cards over the last 30 years.

by Dead_Lemon

4/7/2026 at 6:56:43 PM

Dystopian future where AI pumps out slop and uses human feedback and comments to correct the output.

by liquid_thyme

4/7/2026 at 11:07:12 PM

  > uses human feedback and comments to correct the output
tbf, lots of SaaS have a similar attitude with things like "give us feedback" on their pages; like, I'm paying you money to figure this stuff out, so why are you asking me if it's good or not? With more and more "vibing" I feel this kind of attitude is going to infect everything at some point...

by andrekandre

4/7/2026 at 9:28:03 PM

Isn’t this a bit like spell-checking a Nigerian Prince email? The valuable eyeballs are the ones that don’t notice or don’t care.

by addaon

4/8/2026 at 5:11:01 PM

Sure, if ranking is done purely based on clicks and not quality. I'm just thinking of it as a meta "loss" function in the AI context. So I'd say it's the passionate enthusiasts who care enough to provide feedback on such topics.

by liquid_thyme

4/7/2026 at 8:21:40 PM

Dystopian present

by sealeck

4/7/2026 at 10:17:21 AM

It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.

At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.

by mrweasel

4/7/2026 at 11:51:53 AM

Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.

https://en.wikipedia.org/wiki/S3_Texture_Compression

by mizzack

4/7/2026 at 2:20:18 PM

Loads of games from the era roundtripped their textures through lossy S3/DXT compression and then stored them as uncompressed RGB or RGBA.

I know this because I wrote an Unreal Engine texture repacking tool with a "DXT detection" feature so that I wouldn't be responsible for losing DXT compression on a texture which had already paid the price, only to find that this situation was already hyperabundant in the ecosystem.

Many Unreal Engine games of the day could have their size robotically halved just by re-enabling DXT compression in any case where this would cause zero pixel difference. This was at a time before Steam, when game downloads routinely took a day, so I was very excited about this discovery. Unfortunately, the first few developers I emailed all reacted with hostility to an unsolicited tip from what I'm sure they saw as a hacker, so I lost interest in pushing and it went nowhere. Ah well.
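Not their actual tool, but a minimal sketch of the detection idea, assuming an HxWx3 uint8 image (a real detector would also verify each block's colors fall between two RGB565-quantizable endpoints):

  import numpy as np

  def looks_dxt_roundtripped(rgb: np.ndarray) -> bool:
      # DXT1 reconstructs every 4x4 block from at most 4 colors (two
      # endpoints plus two interpolants), so any block with more than 4
      # distinct colors cannot have come out of DXT compression intact.
      # Passing this check is only a strong hint, not proof, that
      # recompressing would cause zero pixel difference.
      h, w, _ = rgb.shape
      for y in range(0, h - 3, 4):
          for x in range(0, w - 3, 4):
              block = rgb[y:y + 4, x:x + 4].reshape(-1, 3)
              if len(np.unique(block, axis=0)) > 4:
                  return False  # genuinely uncompressed content
      return True  # consistent with a prior DXT roundtrip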

by jdewerd

4/7/2026 at 2:43:16 PM

The article blew a huge opportunity to showcase the great diversity of “Pioneering Era” 3D accelerators (they weren’t called GPUs until later). But instead they just pretended it was always NVIDIA vs ATI, and threw in a few Voodoos.

by ryandrake

4/7/2026 at 3:13:07 PM

It was only 3dfx and NVIDIA (since the TNT) that mattered in the 1990s though. All the other 3D accelerators were only barely better than software rasterization, if at all.

Seeing Quake II run butter smooth on a Riva TNT at 1024x768 for the first time was like witnessing the second coming of Christ ;)

by flohofwoe

4/7/2026 at 4:15:28 PM

Rendition's VQuake was actually pretty good, more than barely better than software rasterization.

by djmips

4/7/2026 at 8:54:13 PM

Edge anti-aliased polygons!

by ChrisClark

4/7/2026 at 8:53:56 PM

Before that, you could even run Quake with anti-aliasing on one of those "barely better than software rasterization" cards; that couldn't even be done on the first Voodoo cards.

by ChrisClark

4/7/2026 at 4:19:32 PM

And they say that Nvidia coined the term GPU, but I recall that Sony did it earlier... not that it really matters.

by djmips

4/7/2026 at 1:12:37 PM

+1 to that. When I first saw Unreal Tournament with the add-on compressed texture pack, it was a real WOW moment.

by aruametello

4/7/2026 at 5:24:50 PM

Yeah, it also lacked driver support. But for a very brief moment it was king of the hill.

by holoduke

4/7/2026 at 6:12:01 PM

My contributions: Matrox Parhelia for the first card supporting triple-monitors, and ATI All-in-Wonder which did TV out when media centre TVs weren’t really a thing.

by xattt

4/7/2026 at 7:31:00 PM

The big feature of the All-in-Wonder was TV in. You could record, in glorious analog detail that could quickly use up your entire hard drive.

by MBCook

4/7/2026 at 11:13:22 PM

I can remember using an AiW card to play PS2 on my computer screen when my TV died. The latency wasn’t great but we still had fun.

by doubled112

4/8/2026 at 3:56:20 PM

> RTX 4000 and 5000

These GPUs have made DLSS and frame generation usable technologies, getting you reasonable 4k gaming on a budget.

It’s not perfect yet, but almost all new games support it and despite the widespread complaints, very few people actually disable these features.

by fooker

4/7/2026 at 2:22:25 PM

I remember there was a kernel module for the Matrox/MPlayer combination. You got a new device that MPlayer could use: `-vo mga` for the console and `-vo xmga` for X11. You couldn't tell the difference, and both produced high-quality hardware YUV output.

by gen2brain

4/7/2026 at 3:48:10 PM

For a moment, a Matrox G400 DualHead was THE card to have for a multi-monitor setup.

by tbyehl

4/7/2026 at 5:16:42 PM

This was a very sweet video card.

by rangerelf

4/7/2026 at 11:35:48 AM

Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RT performance, so compared to the 2000 series they're far more useful today.

by whizzter

4/7/2026 at 1:31:36 PM

The 4000 series certainly did; the "shader execution reordering" gave a meaningful uplift to tasks that "underutilized warp units due to scattered useful pixels".

It seems to have helped path tracing by a lot.

by aruametello

4/7/2026 at 1:35:59 PM

I think their point is RTX is not useful.

by LoganDark

4/7/2026 at 3:09:00 PM

> S3 ViRGE and the Matrox G200

Both were only really famous for how terrible they were, though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)

by flohofwoe

4/7/2026 at 4:34:53 PM

The only thing the ViRGE was good for was passing through to a Voodoo2

by pak9rabid

4/7/2026 at 7:32:03 PM

But it WAS ultra popular with OEMs. If you had embedded video there was a huge chance that was it.

by MBCook

4/7/2026 at 6:43:26 PM

Matrox was really halfhearted with game support. They seemed far more interested in corporate customers, heavily advertising stuff like "VR" conference calls that nobody wanted. They were early with multi-monitor support back when monitors were big, heavy, and expensive. I had a G200 that was the last video card I've ever seen where you could expand the VRAM by slotting in a SODIMM. It also had composite out so you could hook it to a TV. I played a lot of games on it up until Return to Castle Wolfenstein, which was almost playable, but the low-res textures looked real bad and the framerate would drop precipitously at critical times, like when a bunch of Nazis rushed into the room and started shooting.

Last time I saw a Matrox chip it was on a server, and somehow they had cut it down even more than the one I had used over a decade earlier. As I recall it couldn't handle a framebuffer larger than 800x600, which was sometimes a problem when people wanted to install and configure Windows Server.

by jandrese

4/7/2026 at 10:30:53 AM

The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.

by formerly_proven

4/7/2026 at 10:52:45 AM

Like virtualized NICs pretending to be an NE2000? That's interesting, do you know why they'd use a G200 and not something like an older ATI chip?

by mrweasel

4/7/2026 at 2:34:01 PM

The ATi Rage 128 was used in everything short of toasters for a long time too. I assume that the drivers were part of what made it obsolete.

by hypercube33

4/7/2026 at 7:06:29 PM

I remember having a ton of servers with cut down Mach64 chips. They were so bad that you would get horizontal lines flickering across the screen while text was scrolling in an 80x25 text console. I don't know why server manufacturers go to so much effort to make the console as terrible as possible. Are they nostalgic for the 8 bit ISA graphics from the original 5150? They seem offended at the idea that someone might hook a crash cart directly up to their precious hardware.

by jandrese

4/7/2026 at 7:08:07 PM

They were probably forced to update when they dropped older buses. Without a PCI or AGP bus on there, they have to find something that can hang off of a PCIe lane.

by jandrese

4/7/2026 at 11:18:25 AM

Probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the needed I/O features to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding that.

by formerly_proven

4/7/2026 at 1:16:59 PM

Drivers, probably.

by bluedino

4/7/2026 at 4:20:15 PM

Even current Dell servers less than a year old ship with G200 graphics. If it works, why change it? A 1998 ASIC can be put in the corner of a modern chipset for pennies or less.

by jeffbee

4/7/2026 at 11:54:48 AM

Matrox G200 GPUs came integrated with servers for absolute ages, like past the 2010s.

by PunchyHamster

4/7/2026 at 1:12:24 PM

This is an ad from a viral marketing company and everyone here is falling for it.

by cubefox

4/7/2026 at 8:41:07 PM

>This is an ad from viral marketing company

they aren't a marketing company:

"Dashboards, CRMs, automations. We're a small consulting team that turns your messy spreadsheets into systems that run your business."

by john_strinlai

4/7/2026 at 9:07:22 PM

They are: https://sheets.works/data-viz/hire

by cubefox

4/7/2026 at 10:31:44 PM

Huh, my mistake, you are correct. They apparently do offer marketing visualizations; they just don't mention it anywhere on their home page.

by john_strinlai

4/7/2026 at 5:18:21 PM

What are they advertising? Nvidia graphics cards?

by izzydata

4/7/2026 at 5:41:19 PM

Yes. They are likely also advertising for themselves with how viral their ads are. The article is featured on their website.

by cubefox

4/7/2026 at 4:11:46 PM

>S3 ViRGE

decelerator?

>Matrox G200

Because it never got an OpenGL driver? Because it was 2x slower than even the Savage3D? The Nvidia TNT released a month later, offering 2x the speed at a lower price:

https://www.tomshardware.com/reviews/3d-chips,83-7.html

truly a graphics card that mattered! :)

by rasz

4/7/2026 at 9:41:31 PM

From memory, the cards that stood out were:

The Nvidia 6000 series, the first Nvidia cards to support SLI. I remember my gaming PC in college with a 6000-series card, and being able to get another card and use an SLI bridge that increased performance in some games.

The Nvidia GeForce 900 series, which had the Titan with 12GB, IIRC the first card able to support higher-resolution gaming.

The Nvidia RTX series, which started with the 20xx I think, the first to come with 24GB of RAM.

And then the modern 4xxx series, which used to fry power cables.

by ActorNightly

4/7/2026 at 9:07:53 AM

A lot of GPUs in this list are basically just the previous GPU but faster or with more RAM. I kind of thought it was going to focus on interesting new architecture innovations.

by __alexs

4/7/2026 at 4:25:38 PM

Not only that but a lot of the "defining games" are just games that appeared at about the same time but can be handled by much older GPUs without issues. For me graphics haven't made a real difference since Unreal Engine 4 anyway. It's all about the content these days, not the skin.

by ndarray

4/7/2026 at 3:25:25 PM

does anyone have pointers to similar articles that talk about GPU history?

One example is "No graphics API" by Sebastian Aaltonen shared here 3 months ago, which is a great tour de force of graphics stack innovations through contrasting the history of OpenGL/Vulkan and WebGPU/Metal development: https://news.ycombinator.com/item?id=46293062 Because it requires an in-depth understanding of the shader pipeline, the article touches on significant graphics cards of the era. I'd love to see more about that!

by gcr

4/7/2026 at 12:27:31 PM

Like the PS3? Seems like everything is using PC architecture now. It does have RDNA.

by koolala

4/7/2026 at 9:45:13 AM

Honorable mention, the Rendition Vérité 1000 https://fabiensanglard.net/vquake/index.html

Released before the Voodoo 1, with vQuake and accelerated support for Tomb Raider.

by vman81

4/7/2026 at 11:39:58 AM

Agreed. Those early manufacturers/models that experimented more feel more relevant than the more incremental listings of multiple 2000, 3000, and 4000 series Nvidia GPUs.

by whizzter

4/7/2026 at 2:59:05 PM

This sent me down a huge rabbit hole of memories and reading. Thank you! I remember everyone being hyped for that card for their first Pentium / Pentium Pro builds at the time, but I think a lot of the people I was hanging around with held off for the Pentium II and a TNT or Rage 128 card.

by hypercube33

4/7/2026 at 11:47:33 PM

Look at that! I thought I had a fairly decent overview of the history of GPUs, but this is a card I have never heard of before. Thanks for this.

by HerbManic

4/7/2026 at 1:10:43 PM

Very interesting culture difference between Rendition and 3dfx in their chip design approach.

by jnpnj

4/7/2026 at 1:19:20 PM

It's a very honorable mention in my eyes because it's more deserving of the title of "first independent graphics unit" than the GeForce 2 (it did more than just blast already-projected triangles at the screen).

Not that it was an awesome product, but it certainly was flexible.

A good (albeit tiny) demo of that is that vQuake has the same wobbling water distortion as software-rendered Quake, but rendered entirely through the GPU. Perhaps with some interpretation this could be called the "caveman discovers fire" moment of the pixel shading era.

by aruametello

4/7/2026 at 7:52:55 PM

The first fully programmable GPU was a MIPS CPU core with bolted-on stuff. Too bad about the hardware bugs. It was my first accelerator, in the Creative 3D Blaster.

by christkv

4/7/2026 at 9:15:21 AM

I think pairing the RX 5700 XT with Control as the "defining game" is an interesting choice, considering that 1. AMD cards were incapable of RT at the time, and 2. Control was basically the first game with a good, comprehensive RT implementation that had a massive positive impact on the graphics.

by paavohtl

4/7/2026 at 9:36:50 AM

> massive positive impacts on graphics

I remember the main noticeable difference being ray traced reflections. However that was mostly on immovable objects in extremely simple scenes (office building). Old techniques could've gotten 90% there using cubemaps, screen space reflections, and/or rasterized overlays for dynamic objects like player characters. Or maybe just completely rasterize them, since the scenes are so simple and everything is flat surfaces with right angles anyways. Might've looked better even because you don't get issues with shaders written for a rasterized world on objects that are reflected.

Games that heavily advertise raytracing typically don't use traditional techniques properly at all, making it seem like a bigger graphical jump than it really is. You're not comparing to a real baseline.

Overall that was pretty much the poorest way to advertise the new tech. It's much more impressive in situations where traditional techniques struggle (such as reflections in situations with no right angles or irregular surfaces).

by chmod775

4/7/2026 at 2:04:01 PM

The most impressive part of Control's RT (on PC at least) was that it very much applied to (most) dynamic objects - and it features a TON of dynamic destruction.

The "office building" setting meant resticted areas, sure, but it features TONS of reflections - especially transparent reflections (which are practically impossible to decently approximate with screen space techniques).

Oh, and: The Northlight Engine already did more than most other engines at the time to get "90% there" with a ton of hybrid techniques, not least being one of the pioneers regarding realtime software GI.

by dahauns

4/7/2026 at 9:55:48 AM

The other elephant in the room is the consoles: even if they're capable of RT, they also have to weigh performance capabilities against visual payoff. As I see it, the PC versions of games like Control from studios like Remedy are trailblazers. It's an early implementation (GeForce 20 released in 2018, Control in 2019) offered as the ultra option to shake down their implementation and start iterating early so future games will benefit; the baseline, however, is non-RT.

by keyringlight

4/7/2026 at 8:57:04 AM

Absolute nostalgia fever. About a month ago, I dug up an old desktop in the corner, took the drives out and gave away the machine. It felt like putting a racehorse to pasture: i7-4790k, 1080 Ti. It was my dream machine when I got it. Dual-boot (as we did back in the old days when Proton wasn't here) to Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.

And that brought to mind my older dream machine, an 8800 GT from generations past, before which we made do with a Via Unichrome that worked well enough on the OpenChrome driver that I could edit open software (Freespace only needed a few constants changed) so it would render (some of the image was smeared and so on, but I could play!).

by arjie

4/7/2026 at 9:36:38 AM

I'm still rocking a Z97, i7-4790k and a 980 Ti :) I'm still waiting until I need an upgrade. DDR3 is still performing well enough for the games I run.

by ramon156

4/7/2026 at 10:10:33 AM

Same. Still play StarCraft2 on a 4790k and AMD R9 Fury X.

by karmakaze

4/7/2026 at 10:27:35 AM

I was running a 970ti for the longest time, it was only when I wanted to get into some VR gaming that it was time for an upgrade.

by kawsper

4/7/2026 at 4:34:20 PM

It is interesting, the consumer high you get from buying things. I remember being in a Microsoft store like 12 years ago and wanting this Surface laptop, holding it in my hands but unable to afford it. Now I have a Surface Book 3 and it's still cool, but not the same experience as when it was a flagship/new at the time.

Still, there are a lot of laptops I'd like to try when they get cheaper. As far as GPUs, I like the Nvidia Founders designs. It was a while before I got a 3080 Ti FE, which I ended up having to sell at a loss when I didn't have a job; that was sad. I have a 4070 Founders now, which does struggle on certain games at 1440p, but I'm going to use it to run local LLMs.

by ge96

4/7/2026 at 11:56:03 AM

I also have that exact setup sitting around, but am just using my ryzen laptop now.

by sva_

4/7/2026 at 7:13:08 PM

My current machine is an i5-3570k with a 1070Ti...

The old CPU is actually more of an issue. I couldn't run Civ 7 because the game (probably the DRM) uses some instructions that aren't implemented on that CPU. Other than that I bet it would run just fine.

I was just about to upgrade before hardware prices went through the roof. Now I'm just holding on until some semblance of sanity returns, hoping every day that the bubble pops and loads of gently loved hardware starts appearing on the secondary market. Also, the way nVidia has been skimping on memory for all but the most outrageously expensive chips has grated on me. I was really hoping they would buck the trend with the 5xxx generation, but nope, and with RAM prices the way they are I have little hope for the 6xxx generation. My current card is close to a decade old and has 8GB of VRAM. I'm not upgrading to a card with 8GB of VRAM, or even 12GB. That 8GB was crucial in future-proofing the original card; none of its 4GB contemporaries are of much use today.

by jandrese

4/7/2026 at 1:17:55 PM

My TrueNAS SCALE server is still happily running on an i7-3770.

by alasano

4/7/2026 at 12:02:38 PM

I used my 1080 Ti for about eight years. The successor GPU is in some ways way faster (raytracing, AI features, etc.), but in others really quite stagnant considering the huge stretch of time that passed between them. ~10 years for 2-3x performance in GPUs at higher nominal and real price points shows how slow silicon advances have been compared to the 90s and 2000s. The same period from 2000 to 2010 would've seen 1000x performance if not more. The difference between a 1080 Ti and a more expensive RTX 50 card is that the RTX can render ideally triple the frames in synthetic benchmarks, double the frames in some rasterizing games (most games won't see gains that high), and do a few relatively tame raytracing tricks at performance which is still not really good. At the same throughput it consumes maybe half the power or a bit less. The difference between a GeForce 2 and e.g. a Radeon HD 4000-series card is several planes of existence.
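As a quick back-of-the-envelope on those figures (taking the 2-3x and 1000x decade totals at face value), the implied annualized rates:

  # annualized rate implied by a total speedup over 10 years: total ** (1/10)
  for label, total in [("2x/decade", 2.0), ("3x/decade", 3.0), ("1000x/decade", 1000.0)]:
      print(f"{label}: {total ** 0.1:.2f}x per year")
  # prints ~1.07x, ~1.12x, and ~2.00x per year respectively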

by formerly_proven

4/7/2026 at 3:20:37 PM

My 1080ti is still working away in my kid's PC. If you connect a 1080p monitor, it will still hit 60fps in mostly everything.

The only thing that holds this card back now is a handful of titles that will not run unless ray-tracing support is present on the card; Indiana Jones and the Great Circle springs to mind.

I am very likely going to get a decade of use out of it across three different builds, one of the best technology investments I've ever made.

by giobox

4/7/2026 at 7:18:00 PM

It really is an impressive bit of hardware. I finally pulled it out of my last system a year ago, but it was definitely holding its own up until that point.

by strictnein

4/7/2026 at 5:35:38 PM

Well, the 5090 is significantly faster than a 1080 Ti. It has 92B vs 12B transistors. That's the 10 years' difference you mention. 10 years before the 1080 Ti we had the 8800 Ultra with 600M transistors. So yeah, you are a bit right. But stacked transistors might become reality in the future and enable transistor increases again.

by holoduke

4/7/2026 at 8:44:29 PM

A 5090 is more than twice as expensive as a 1080 Ti in real MSRP terms and way more than that in actual real terms, since the 1080 Ti was available for some time below MSRP, while the 5090 realistically never was and usually goes for 50-100% above MSRP. So I don't think these can be compared. Basically a similar story with the 5080, it's significantly more expensive in real terms (and about ~2x in nominal terms).

The 5070 Ti would be in the same spot.

If you compare these - the RTX 50 card has a bit higher TDP (which it will usually not reach due to clock limits), is a roughly 100mm² smaller die with around 4x the transistors and about 3x the compute (since much more of the chip is disabled compared to the 1080 Ti's chip). It has 5 GB more memory (11->16) and a lot more bandwidth.

by formerly_proven

4/7/2026 at 9:22:31 AM

Hey, I could have used that i7-4790k!

I've been running the worst gaming set up I can get away with, which atm is a 3080 10gb, using random DDR3 ram, a budget WD 512gb ssd, and an i5 of the same socket as the i7-4790k that doesn't even support hyperthreading and can't do more than 4 tasks in parallel.

It's absolutely laughable at this point, but I'm unironically looking for a deal on that cpu lmao, it would be a huge upgrade.

by brailsafe

4/7/2026 at 1:25:51 PM

This is a wonderful-looking infographic, but I truly don't think there are 49 GPUs that mattered in the PC gaming hardware space - let alone all of computer graphics. Call it recency bias, but after the Pascal cards it feels like maybe one or two more entrants actually mattered?

by Shalomboy

4/7/2026 at 3:18:31 PM

Pascal is already too late to matter (2016) IMHO.

With the release of D3D9 in 2002, GPUs of different vendors didn't really stand out anymore since they all implemented the same feature set anyway (and that's a good thing).

by flohofwoe

4/8/2026 at 2:34:23 AM

Nvidia Turing (RTX 20) definitely marked a major shift IMO.

- It was the first card to enable real-time ray-traced effects.
- Mesh shaders are a significant overhaul of the geometry pipeline that's only recently getting real traction.
- Its tensor cores enabled a new generation of AI-driven upscaling/antialiasing. DLSS 2, FSR 4 and XeSS are all some variation of "TAA + neural networks", and these all rely on specialized matrix hardware to get optimal performance.

Obviously all of these features are supported across all vendors now. Intel Arc Alchemist has all of these features as well, and AMD got RT and mesh shader support with RDNA2 while slowly building up to tensor cores with RDNA3/4. But Turing clearly debuted these features, which have majorly changed the landscape of realtime 3D graphics.

by ColonelPhantom

4/8/2026 at 7:28:46 AM

Kind of, because they still had different kinds of limitations, and that played a role in what is available to shaders ever since.

Just like nowadays not all mesh shaders, compute, ray tracing or DirectStorage are born alike across all vendors.

Usually one can expect that an Intel GPU will never deliver as much as an AMD one, let alone an NVidia one.

Naturally this is us focusing on the PC space, if taking mobile, game consoles, or Apple ecosystem, then there are many other factors.

by pjmlp

4/7/2026 at 3:49:23 PM

IMO there’s room for something more recent, maybe a Titan or something, to stand in as an avatar for making GPUs as compute accelerators a thing. I know that’s been going on forever, but at some point it went from some niche hacky thing to a primary use-case for the cards.

But yeah, this list has a ton of incremental bumps on it. Maybe there was some mixing of cards that mattered historically and cards that mattered to the author.

by bee_rider

4/7/2026 at 9:06:41 AM

The 8800 GT is easily the most impactful GPU in my mind. The combination of that video card with Valve's Orange Box was an insane value proposition at the time.

I'd put the 5700xt at #2 for being the longest lived GPU I've owned by a very wide margin. It's still in use today.

by bob1029

4/7/2026 at 10:23:31 AM

Came here for this omission. I saved up for a long time to get an 8800 GTX, and I had that card for 5 years before upgrading again.

by aeonik

4/7/2026 at 9:44:45 AM

I retired my 5700 XT a few years ago. Wasn't there some kind of hardware problem with it? It kept locking up my Linux kernel.

by skerit

4/7/2026 at 10:54:48 AM

Still using my RX 5700 XT. The amdgpu driver had a major issue resuming from suspend a few months ago[0], but other than that, I'm not aware of (nor have I experienced) any stability issues. Maybe you had a bad card.

0: https://gitlab.freedesktop.org/drm/amd/-/issues/4531

by MrDOS

4/7/2026 at 10:38:07 AM

I don't like to spend much on hardware, so I bought a 5700 XT a few years ago and run a "steam machine" of sorts. Never had any Linux-related problems.

by exitb

4/7/2026 at 4:01:19 PM

There's no horizontal scroll bar; apparently I need to click and drag the GPU section leftwards with the mouse. (Am I old now?)

by andai

4/7/2026 at 4:55:02 PM

Kind of funny that someone making a website about desktop hardware didn't expect anyone to use desktop hardware to look at their site.

by kiddico

4/7/2026 at 4:39:29 PM

Everything is designed for phones now.

Apparently it's a Millennial trait to insist on doing things with a "big screen".

by dleslie

4/7/2026 at 5:39:46 PM

Fixed it, thanks for the feedback

by akashwadhwani35

4/7/2026 at 9:03:54 AM

That mattered in PC evolution, rather; it misses many others, e.g. the TMS34010.

https://en.wikipedia.org/wiki/TMS34010

by pjmlp

4/7/2026 at 1:35:24 PM

SGI IMPACT would be another

by jeffbee

4/8/2026 at 3:00:37 AM

Glad to see I'm not the only person that noticed the lack of SGI stuff.

by gt0

4/7/2026 at 2:32:21 PM

Or the RealityEngine, but really any SGI - that's kind of where modern 3D computer graphics began.

by wk_end

4/7/2026 at 8:37:15 PM

Ikonas 3000

by kps

4/7/2026 at 2:56:16 PM

Sure. I picked IMPACT for having hardware textures. An interesting list would contain the first examples of processors that had things we still have today, like geometry processors. It would also contain evolutionary dead ends that tried to do things differently, like the Rendition Vérité.

by jeffbee

4/7/2026 at 3:30:29 PM

I have a strong feeling that this website was designed by Claude Code using the /frontend-design skill.

by xcodevn

4/7/2026 at 4:56:54 PM

Ok? Not bad to be able to throw something like that together with minimal effort. Works nicely enough.

by alienbaby

4/8/2026 at 12:38:35 PM

It's pretty, but almost exclusively focused on ATI/AMD and Nvidia, with an Intel thrown in for fun. Not actually a list of important GPUs.

by vman81

4/7/2026 at 2:40:48 PM

Matrox needs a mention somewhere. GPUs do raster too, and theirs were optimized for an entirely different market.

by snarfy

4/7/2026 at 4:01:50 PM

I remember the Millennium as the first 3D accelerator. It didn't do texturing, but a lot of the games didn't need it yet. It still did Gouraud shading.

by qingcharles

4/7/2026 at 9:46:57 AM

I really want to see TDP over time.

If I can at least tell myself that our technological achievements come with efficiency gains instead of just apeing power throughput, I can rest a little better

by tetris11

4/7/2026 at 10:38:42 AM

Here's one anecdotal datapoint:

About a decade ago, I discovered that the HD 530 iGPU included with my budget-oriented i3-6300 CPU was better-performing than the physically-impressive SLI pair of 9800GTs I had been using, at something like 1/10th the power consumption.

(It didn't do PhysX, but nobody cared.)

by ssl-3

4/7/2026 at 10:03:17 AM

Missing the Radeon RX Vega 64!

by Tepix

4/7/2026 at 9:30:04 PM

I've got a Vega 56 still running; still impressive memory bandwidth.

by christkv

4/7/2026 at 9:45:33 PM

> Apple chose the Rage 128 [Pro] for the original iMac G3, making it the most popular Mac GPU of its era.

This is misleadingly worded because the original iMac had a 3D RAGE ⅡC, the five-color models had the 3D RAGE Pro, and the slot-loading models had the earlier RAGE 128 VR.

Yes those are all confusingly named by ATi :p

But based on the timeline and features mentioned, they're specifically talking about this one and not any of the earlier chips in the RAGE family: https://en.wikipedia.org/wiki/ATI_Rage#Rage_128_Pro_/_Rage_F...

iMacs didn't ship with RAGE 128 Pro until the year-2000 Indigo & DV models, by which time the RAGE 128 Pro was already 11 months old: https://everymac.com/systems/apple/imac//faq/imac-g3-video-p...

by Lammy

4/7/2026 at 8:52:25 PM

I don't think much of the "defining game" thing. Many of them feel like they're just thrown in as a big game at the time - Diablo 2 is an amazing game, and was very popular, but it wasn't fully 3D and the resolution was so limited I don't think there was usually a need to buy a new video card to play it (in fact I think it might have been just fine in software most of the time).

by CamouflagedKiwi

4/7/2026 at 7:20:54 PM

Awwww..., this brings so many memories. I had almost all of the early ones: Voodoo 2, Riva TNT2, then GeForce 3 (I think...). Then I switched to laptops and didn't have a discrete graphics till last year when I started playing with LLMs locally. So basically I jumped from GeForce 3 to RTX 3090 :) Thank you for bringing those memories back!

by alentred

4/7/2026 at 2:59:35 PM

Well, my 9070 XT made the list; I've been quite happy with it, great performance without paying the Nvidia tax.

RIP my Radeon 7500 from high school though. That was always a budget card, and we all had them but wanted the 9700. Couldn't beat the box art from that era though: https://www.ebay.com/itm/206159283550

by mikepurvis

4/7/2026 at 2:40:27 PM

The RX 580 is on there, but not the R9 290. I'm not sure where the RX 500 series actually pushed technology forward. They always seemed like the AMD budget line. And if the 580 is important, why not the 590 or the 570?

Few of the “pre-GPU” graphics accelerators that seem to have mattered are here. The ViRGE. The Mach32 and Mach64. The Trident cards, like the TGUI9440. Yet the Voodoo often isn’t considered a GPU and is on the list.

by cestith

4/7/2026 at 2:48:54 PM

The 590 was great and lasted me around 5-6 years until I picked up a replacement, but it was really just a rebadged 580.

The 580 is a solid card that was an excellent price/performance value and held a respectable spot in the market for a very long time. Many video games now use it as the entry-level bar for playability.

It doesn't hold the same "type" of spot, but it's a workhorse in the same way something like an NVIDIA 1070 was.

by esseph

4/7/2026 at 11:49:09 AM

I don't understand this - where's Trident VGA?

by blackhaz

4/7/2026 at 10:25:29 AM

We had the Riva TNT2 in our family computer, so that was fun to see that again, I think it was paired with an AMD K6-2 chip.

One day one of my friends from school wanted to optimize airflow in our computer and re-did the cabling, but he managed to block the CPU fan from spinning. I am not sure how, but we didn't realise it for a couple of months.

When I got my own PC, it had an AMD Barton chip, and it allowed me to play Half-Life 2.

by kawsper

4/8/2026 at 4:34:10 AM

Compute stopped behaving like a consumer good and started behaving like infrastructure; prices went from competitive cycles to ever higher while performance kept compounding. That's usually what happens when something becomes a bottleneck for entire industries and not just end users, so the gap between what people use and what's at the frontier says it all.

by latentframe

4/7/2026 at 1:00:03 PM

I had the Voodoo 1 with VGA passthrough from the 2D card. When you loaded a game you'd hear a little clunk from a relay on the Voodoo taking over the VGA signal, and you knew you were about to have a good time. Doesn't seem that long ago!

by Neil44

4/7/2026 at 5:04:13 PM

I'd be really interested to see SGI on this chart. When did consumer hardware exceed what you could do on an SGI box?

I think Sun and HP had some 3d capabilities, but it was mostly aimed at engineering/CAD

by paddy_m

4/7/2026 at 9:08:18 PM

The SGI stuff was also engineering focused. The net result was that it was not really that fast; powerful, sure, but my understanding is the early consumer cards (Voodoo) could run rings around them. The game cards did not have the z-buffer depth, fill rate, 3D texture support, or line drawing that SGIs had (CAD features), but they could keep the frame rates high and had more features that made the games look pretty.

My personal favorite SGI from the mid 90's was the O2. It had a unified memory model, so it was the slow red-headed stepchild of the SGI ecosystem. But because of that unified memory you could effectively pack it with close to a gigabyte of texture memory, whatever the OS and app did not need. This was an obscene amount in 1996. For comparison, the top-of-the-line SGI desktop system at the time had 8 MB of texture memory. It does not hurt that the O2 was probably the best designed and engineered computer I have ever seen.

https://computers.popcorn.cx/sgi/o2/o2-05.jpg

https://computers.popcorn.cx/sgi/o2/

by somat

4/7/2026 at 8:51:13 PM

The Nvidia NV1 mattered even if it was a misstep.

I'd say Voodoo 3 mattered because it killed 3dfx.

And the Matrox Parhelia mattered for much the same reason.

by RantyDave

4/7/2026 at 10:04:13 PM

Fails to mention any TBDR-based GPU at all. Do PowerVR and Qualcomm not exist? Or hell, Apple?

by slabtickler

4/7/2026 at 8:58:15 AM

This brings back so many memories. I remember how badly I wanted a GeForce 6800. Sadly, I was never able to justify spending that much money on a GPU. Still holds true, even today.

by 0x70dd

4/7/2026 at 10:54:56 AM

I had the 6600 GT, insane price-perf ratio, kept it for like 8 years

by yread

4/7/2026 at 10:19:08 AM

I have fond memories of borrowing a Voodoo 2 from a friend when I was moving from a 486 to a K6-based system component by component. At that time I was still using my old ISA VGA card, which meant 2D performance was horrible, and I couldn't really watch videos on that thing - but thanks to the Voodoo I could play Unreal Tournament without problems.

by finaard

4/7/2026 at 4:03:12 PM

I wouldn't call a card like the 5080 important. It was incremental compared to the previous generation, a poor value for money, and awkwardly placed, being very cut down compared to the 90 class of that generation, significantly more so than in earlier generations.

by Night_Thastus

4/7/2026 at 2:01:55 PM

Missing the Rage Fury Maxx, the finest welding job by the boffins at ATI, severely hampered by its software support.

by silversmith

4/7/2026 at 11:05:39 AM

I don't see my first GPU on there: the humble GeForce4 MX440. It could run almost any game I cared about for a surprisingly long time, even if it's not a true modern card. These days almost all my machines are on iGPUs baked into the CPU. That's way less fun for me, but they are a lot more compact at least.

by Lwrless

4/7/2026 at 3:07:31 PM

The GeForce 4 generation as a whole, while solid enough, was historically not interesting. They were just basic spec bumps over the GeForce 3, with no new features or similar. And, critically, the 9700 Pro released the same year as the GeForce 4 and absolutely smoked the living shit out of it.

by kllrnohj

4/7/2026 at 3:27:47 PM

And the 4 MX versions were GeForce 2 MX based IIRC. 3 was expensive.

by fabioborellini

4/7/2026 at 3:33:52 PM

The MX440 allowed players that were playing games on id Tech 3 to finally play at high frame rates. I remember this card being all the rage back then in pro gaming circles for this reason.

by uncivilized

4/7/2026 at 10:48:32 PM

The MX440 was an entry-level budget card? If it was all the rage in pro gaming circles at the time, that's really just a reflection of how poor pro gamers were back then rather than anything to do with the MX440 being particularly noteworthy. In fact, looking back at old reviews, it was if anything a flop. Launch MSRP was too expensive for the performance it offered, especially when it was a DX7 card surrounded by DX8 cards at almost the same price point (including Nvidia's own Ti 4200 for just $50 more).

by kllrnohj

4/7/2026 at 11:16:55 AM

That will probably be my next GPU.

I'm on a 3060 currently and the changes in the 4xxx and 5xxx just aren't appealing to me. As soon as iGPUs get 3060 performance I'll probably switch. And they aren't far off.

by cogman10

4/7/2026 at 11:31:58 AM

The MX440 is a nearly 25-year-old GPU; it performed somewhere between a GeForce 2 and a GeForce 3 Ti 200.

It was a good budget option those decades ago.

by xnorswap

4/7/2026 at 12:07:40 PM

Yes the MX440 deserves to be on this list. More important than the GeForce2 imo.

by uncivilized

4/7/2026 at 9:07:03 AM

Ah, I was just trying to remember the model names last week and this website pops up like magic; weird how the internet works sometimes. The 560 Ti was a dream for teenage me and most of my friends back then, but I must say my Radeon HD 4870 powered most of my favourite Team Fortress 2 years.

by Zealotux

4/7/2026 at 10:44:13 AM

Yeah the 560 Ti was insanely popular in my group of friends. In ~2004 there was a good amount of FX 5700s, some people struggling on Geforce 4, and some on the FX 5900 Ultras. Some were updating every two years, some closer to four. When the 560 Ti came out, everyone got it.

by noxvilleza

4/7/2026 at 11:30:40 AM

The 9400 GT mattered to me as it was my first GPU. I had bought NFS Carbon only to find that the home PC only had a CD drive, not a DVD drive, lol. So with that drive upgrade finally also came the 9400 GT, and fun ensued.

by abhikul0

4/7/2026 at 1:17:54 PM

Not including the Diamond Monster Fusion, the first 2D/3D card, is a glaring omission.

by glitchc

4/8/2026 at 12:05:39 AM

The Voodoo shipped with Tomb Raider. That was its defining game.

Diablo II wasn't even a 3D game.

Many problems on this page.

by scorpionfeet

4/7/2026 at 10:31:19 AM

not a very good list, from a historical perspective it’s missing many important cards, as mentioned by others

also, the gpu did not exist until 1999

looks like this was created for engagement

by momocowcow

4/7/2026 at 10:58:02 AM

1999? You sure?

by bdavbdav

4/7/2026 at 12:32:05 PM

The point is that Nvidia popularized the term, I'd guess.

Nvidia called the GeForce 256 the first ever GPU.

by erinnh

4/7/2026 at 5:18:54 PM

Did anyone else notice the decline of graphics on the GPUs' coolers? I miss that classic box artwork too!

by deadcore

4/7/2026 at 3:02:35 PM

I remember Voodoo - precisely because I didn't have it back then, as it was a luxury option.

by stared

4/7/2026 at 2:18:44 PM

Can't seem to load the page, is it down? Can't establish a connection to the server at sheets.works.

by yasuocidal

4/7/2026 at 4:54:42 PM

To this day I still use an RX 580 (8GB version) on Linux. This card is really underrated.

by ipmanlk

4/7/2026 at 4:55:55 PM

It's basically AMD's 1080ti, except they just kept making them lol

by kiddico

4/7/2026 at 1:40:48 PM

I remember having the Voodoo card to play Thief: The Dark Project. It felt incredible at the time.

by schnitzelstoat

4/7/2026 at 2:07:09 PM

Wow, I stopped following hardware releases after the GeForce 2, and that was in 2004?

by rayiner

4/7/2026 at 12:57:28 PM

I know sheets.works was made with an agent; still, good taste on the design.

by hchak

4/7/2026 at 1:16:00 PM

I was going the other way, it wasn't obvious enough that it was going to be a horizontal scroll or how to do it. Vertical spacing felt off and the 'defining game' card at the bottom of the video card is nice information but displayed in a distracting manner.

by redorb

4/7/2026 at 1:57:50 PM

Do you think it was entirely AI? Surely some human was involved to get this sort of layout...

by DiffTheEnder

4/7/2026 at 10:55:51 AM

Surprised PUBG was the defining game for so many. I don’t recall it being a demanding one.

by bdavbdav

4/7/2026 at 2:21:06 PM

It was just unjustifiably popular.

by sgjohnson

4/7/2026 at 10:00:08 AM

Missed the Voodoo 5 5000, which laid the groundwork for NVLink.

by BoredPositron

4/7/2026 at 11:07:24 AM

My old GTX770 sitting in a drawer somewhere appreciates this post.

by rjnaisu

4/7/2026 at 1:09:50 PM

"Hey, I wonder what they'll say about SGI Impact."

Oh well.

by justin66

4/7/2026 at 10:14:31 AM

This is such a cool visualization. Thanks for creating it!

by nickel0800

4/7/2026 at 4:52:24 PM

nit: The 9070XT is listed as $599 but that price essentially never existed. I was lucky to get one for $730.

by craftkiller

4/7/2026 at 1:18:42 PM

My GPU is there! Rocking my 980ti since 2015.

by oceansky

4/7/2026 at 10:26:50 AM

The title of the site should probably have "for gaming" at the end, as it doesn't consider GPUs for compute such as the A100 or the GTX 580 3GB that AlexNet was trained on.

by rythie

4/7/2026 at 5:31:16 PM

S3 Trio, Matrox Millennium

by jbverschoor

4/7/2026 at 2:15:09 PM

Interesting! Through the times

by ananandreas

4/7/2026 at 11:13:15 AM

I was so sad when I retired my 1060 6GB. That thing served me well for almost a decade.

by bobsmooth

4/7/2026 at 8:58:51 AM

Gaming GPUs only, which are the ones we are all nostalgic about, but hardly the ones that matter now for Nvidia.

by sakex

4/7/2026 at 10:00:40 AM

I see it as similar to virtual reality: it was born and grew up with gaming demands and influences, but other disciplines may be more attractive for a mature product.

by keyringlight

4/7/2026 at 9:08:17 AM

Turns out corporations and governments can pay way more than individuals.

by Ygg2

4/7/2026 at 9:30:10 AM

Oh, my beloved TNT2 Ultra.

by Xenoamorphous

4/7/2026 at 9:35:00 AM

mine too

by akashwadhwani35

4/8/2026 at 1:09:07 AM

Thanks for the website Claude! By the way the GTX 1080 and 1080ti use the same image.

by Computer0

4/7/2026 at 10:14:44 AM

I think it's a terrible UI - it requires 3 different things to see the GPUs: scrolling vertically down to see the Era buttons (which then scrolls up and hides the Era buttons even if you have enough vertical screen space), clicking on the Era buttons, and clicking < > buttons to see the GPUs of an Era.

I can't remember last time I've seen such a confused design.

by dist-epoch

4/7/2026 at 2:21:49 PM

This wasn't even the worst part for me. To scroll within it, since it's horizontal, it is not intuitive to use the scroll wheel, so you click and drag the mouse; however, as the entire surface of the GPU image seems clickable, it feels like you're going to pull up another webpage. It feels like a bad ad that is trying to catch you off guard.

by elictronic

4/7/2026 at 10:27:58 AM

Appreciate the feedback, fixed it

by akashwadhwani35

4/7/2026 at 5:42:04 PM

A terrible list that should not include almost anything released in the last 10 years. We do live in a very dark and long-lasting GPU era.

by PowerElectronix

4/7/2026 at 12:27:20 PM

not the whitehouse.gov design language

by whalesalad

4/7/2026 at 5:06:32 PM

it's Claude's design language just FYI

by airstrike

4/8/2026 at 12:11:06 AM

this was proliferated long before claude came into the scene

by whalesalad

4/7/2026 at 9:23:35 AM

> We build visual stories like this for companies

Combined with the color scheme of this site, this might be a cleverly disguised Nvidia ad.

Edit: Clicking through to their main page [1]: yeah, that's definitely an Nvidia ad.

1: https://sheets.works/data-viz/hire

by cubefox

4/7/2026 at 9:36:53 AM

I made this, and it's not an ad. I chose Nvidia colours, thinking that a GPU website should seem familiar.

by akashwadhwani35

4/7/2026 at 9:42:29 AM

You seem to be affiliated with sheets.works, so it appears to be an ad for that site then.

by cubefox

4/7/2026 at 9:59:24 AM

I noticed that the list seemed a little Nvidia heavy when there were absolutely other cards that deserved a mention in the earlier years.

by Chaosvex

4/7/2026 at 9:59:51 AM

I don't think there's strong evidence of this being an ad. I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list. I think it's just that Nvidia has been the dominant force in consumer-level GPUs for a while now.

by forsalebypwner

4/7/2026 at 10:17:38 AM

> I don't think there's strong evidence of this being an ad.

There is strong evidence. Click on the link above. It was posted by a viral marketing company. They even feature the GPU story on their website: https://sheets.works/data-viz

> I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list.

Yes, because otherwise the ad would be too obvious.

by cubefox

4/7/2026 at 9:02:28 AM

Why didn't datacenter GPUs make the list? AI trained with them is such a significant part of computing today.

by charcircuit

4/7/2026 at 10:00:01 AM

Because consumers don't care about them, probably. They're never going to be remembered fondly like gaming cards.

by Chaosvex

4/7/2026 at 10:18:23 AM

Website is called "Every GPU that mattered". The GPUs that trained AlexNet, GPT-1 and 2 are probably the most consequential GPUs in compute history.

by dist-epoch

4/7/2026 at 11:43:49 AM

Sure, I just explained why they probably aren't there. "Every GPU that gamers cared about" isn't as catchy, I suppose.

by Chaosvex

4/7/2026 at 12:43:02 PM

The reason datacenter cards even exist is gaming GPUs. Gaming basically funded GPU development up to the point of the AI explosion.

So no, the most important AI card isn't an AI card; it's the gaming GPUs that funded that mess.

by PunchyHamster

4/7/2026 at 4:21:41 PM

If there are 49 cards we have enough to cover both the gaming era of GPUs and the AI era of GPUs.

by charcircuit

4/7/2026 at 5:23:51 PM

This is what I call AI slop.

by holoduke

4/7/2026 at 2:21:37 PM

So so so disappointed by not seeing GTX 1650

Such a capable graphics card it was

by paglaghoda

4/8/2026 at 3:30:32 AM

SGI? Evans & Sutherland? GL came from SGI originally!

this is just slop

by joshu

4/7/2026 at 12:46:10 PM

You all fell for a marketing site for: https://sheets.works.

I have to say that this site is complete low-effort slop.

by rvz

4/7/2026 at 2:51:26 PM

Yet if it weren't for the people complaining about it being an ad I wouldn't have even noticed who it was an ad for. Thanks for helping them out!

by pavon

4/7/2026 at 10:41:53 AM

>No RX480

Hard pass.

by u8080

4/7/2026 at 9:05:50 AM

[flagged]

by baudmusic

4/7/2026 at 11:39:33 AM

[dead]

by surcap526