3/5/2026 at 1:52:29 PM
> This makes them AMD’s first desktop chips to qualify for Microsoft’s Copilot+ PC label, which enables a handful of unique Windows 11 features like Recall and Click to Do.

This is not the selling point they think it is.
The problem I see with the AM5 socket is simply that the DDR5 RAM needed to support it is too expensive. So this will not make the big impact they were hoping for.
by bilekas
3/5/2026 at 2:45:51 PM
I have a hard time believing ANYONE thinks this is a selling point. Literally anyone, including marketing and execs at Microsoft. I think they have just sunk too much money into it to quit, so they keep doubling down.
by davidmurdoch
3/5/2026 at 3:47:18 PM
"The problem I see with the AM5 socket is simply that the DDR5 RAM needed to support it is too expensive."

DDR4 is basically just as expensive. At least DDR5 gives on-die error correction (not as good as full ECC).
For a general computer, there's not that much difference between AM4 and AM5 unless you really want the extra speed of DDR5, PCIe 5.0, and the newest processors. You can build a very capable AM4 machine for slightly less money, but that savings is found in the CPU and motherboard, not in the RAM.
by giantg2
3/5/2026 at 10:13:47 PM
The ECC in DDR5 is there just to make it work at all, because of the errors caused by the density and the data rates. The ECC isn't there for you, it is for the manufacturers.
by sitkack
3/5/2026 at 8:01:15 PM
I am still running DDR2 & DDR3 machines! I was going to finally make the big upgrade this year but am now holding off until the market finds a little bit more sanity.
by HerbManic
3/5/2026 at 4:53:27 PM
DDR4 looks to be around half the price of DDR5 on the used market to me. I wouldn't call that slightly less money unless you weren't planning to install much RAM.
by deltoidmaximus
3/5/2026 at 5:36:50 PM
On the new market, they are really close, and most consumers are buying new. If we want to get pedantic, we can compare buying used systems on Facebook Marketplace to cannibalize the parts and resell the rest, to see net cost.
by giantg2
3/5/2026 at 6:26:38 PM
I don't think most people building AM4 systems currently are buying new, or at least not everything new, simply because depending on what you're looking for there might not even be any new parts.
by threetonesun
3/5/2026 at 6:47:47 PM
I think for brand-new computers/builds that's correct, but where it hit me was wanting to upgrade an existing desktop. I already have more DDR4 RAM than I need and would have been willing to purchase a new CPU/motherboard, but being forced to also purchase new RAM at the same time made the price tag too big all at once. I just found the best Zen 3 CPU I could on eBay and called it a day.

I think your point still stands overall for AMD's business though; I assume a vast majority of CPUs are purchased in new desktops?
by tuckerman
3/5/2026 at 9:01:45 PM
I built a workstation / gaming PC in 2024, and I feel like I was on the last chopper out of 'Nam.
by wing-_-nuts
3/5/2026 at 6:22:17 PM
Agree. I just built a desktop for the first time in ages; it is quite a leap from using laptops with numerous components plugged into them. Everything was comparably reasonable except the RAM, or anything that has memory chips on it (RAM, NVMe, etc.), so I did some research just to make sure. All in all I'm happy with the result. I went with the AMD 9900X and skipped the graphics card for now.
by mahirsaid
3/5/2026 at 6:35:39 PM
I would like to add that looking for bundles helps a lot. If you have a Micro Center near you, use it to your full advantage; from what I've seen, they are the only ones offering promotional items with bundles at the moment. The main objective is to skip the price gouging on RAM, which costs more than the CPU right now. I got a CPU and motherboard plus 32GB of RAM for $600, and that was a save, since the RAM alone was $445.
by mahirsaid
3/5/2026 at 3:55:40 PM
If I can use it with Linux in any meaningful way, that would be a better selling point.
by giancarlostoro
3/5/2026 at 4:32:34 PM
You in fact can now! In the past week, FastFlowLM [0], a transformer inference framework for XDNA 2 NPUs, officially started supporting Linux.

I posted it here the same day I found and started using it, to almost no reaction.
[0] https://github.com/FastFlowLM https://fastflowlm.com/ https://huggingface.co/FastFlowLM
by jakogut
3/5/2026 at 6:21:42 PM
> to almost no reaction.

HN is overloaded with AI stuff; it's hard to break through all the noise. I say this as someone very interested in AI. Even I skip some links because it's just too much.
by giancarlostoro
3/5/2026 at 9:05:15 PM
I see it making claims about 10x efficiency, but how are the tokens/second/watt? The only machines I've seen with the memory bandwidth to effectively do local inference are M-series ARM chips on Macs.
by wing-_-nuts
3/5/2026 at 5:05:47 PM
Because it's not faster than the Ryzen 395's GPU. Power efficiency doesn't matter as much as TTFT (time to first token) for desktop users, especially when they're tasking their AMD box as a dedicated inference machine.

Some older pre-395 AMD articles suggested it would be possible to use the NPU for prefill and the GPU for decoding, and that this would be faster than using either alone, but we have yet to see that (even on Windows) for any usefully sized models, just toys like LLaMA-8B.
by vyr
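The prefill/decode split above has a simple back-of-envelope model: prefill is roughly compute-bound (so a faster-at-INT8 NPU could cut TTFT), while decode is roughly bandwidth-bound (so it gains nothing from the NPU when both units share the same memory bus). A minimal sketch, where every hardware number is an illustrative assumption rather than a measured spec:

```python
# Back-of-envelope model of an NPU-prefill + GPU-decode split.
# All hardware figures below are illustrative assumptions, not real specs.

def prefill_seconds(prompt_tokens, params, flops):
    # Prefill is roughly compute-bound: ~2 FLOPs per parameter per token.
    return 2 * params * prompt_tokens / flops

def decode_seconds(new_tokens, model_bytes, bandwidth):
    # Decode is roughly bandwidth-bound: weights are re-read once per token.
    return new_tokens * model_bytes / bandwidth

PARAMS = 8e9        # hypothetical 8B-parameter model
MODEL_BYTES = 8e9   # ~1 byte/param at 8-bit quantization
GPU_FLOPS = 30e12   # hypothetical iGPU compute throughput
NPU_FLOPS = 50e12   # hypothetical NPU compute throughput (INT8-heavy)
BANDWIDTH = 250e9   # hypothetical shared memory bandwidth, bytes/s

print(f"TTFT, GPU prefill: {prefill_seconds(4096, PARAMS, GPU_FLOPS):.2f}s")
print(f"TTFT, NPU prefill: {prefill_seconds(4096, PARAMS, NPU_FLOPS):.2f}s")
# Decode time is identical either way: both units sit on the same memory bus.
print(f"decode, 256 tokens: {decode_seconds(256, MODEL_BYTES, BANDWIDTH):.2f}s")
```

Under these toy numbers the NPU only improves TTFT, not tokens/second, which matches the observation that the split has yet to show wins on usefully sized models.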
3/5/2026 at 3:48:45 PM
Now all the features you don't use can perform 20% faster!
by SunshineTheCat
3/5/2026 at 3:13:23 PM
Upgrading to AM5 wasn't compelling to me even last summer, before things went bonkers; I'm still very content with my 5800X and 64GB of DDR4.

Trying to take the plunge on that now sounds like a nightmare.
by mikepurvis
3/5/2026 at 9:11:19 PM
I got lucky on the timing and got a 9800X3D + 64GB DDR5 before prices increased.

Your machine is sweet and probably runs just as fast on most tasks. I wouldn't be in a hurry to upgrade.
I upgraded from an old Intel i7.
by hu3
3/5/2026 at 7:38:16 PM
To really take advantage of those GPU cores you need memory bandwidth. Modern transformer-based LLMs are really bandwidth hungry. I am really happy to see this first push. NVIDIA's discrete GPU/memory approach is an option, but not great for a lot of different reasons. Unified memory architectures like what AMD and Apple have are the way to go for the future. Put 256GB of RAM on the main board and make it accessible at speed for LLM use, please.
by jmward01
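The bandwidth-hunger claim has a quick sanity check: during decode, each generated token requires streaming the full set of weights through the memory bus once, so tokens/second is capped at roughly bandwidth divided by model size. A sketch with nominal, illustrative bandwidth figures (not exact specs for any particular product):

```python
# Rough decode-speed ceiling from memory bandwidth alone:
#   tokens/s <= bandwidth / model_size_in_bytes
# Bandwidth figures are nominal/illustrative, not precise product specs.

MODEL_BYTES = 35e9  # e.g. a ~70B-parameter model at 4-bit quantization

systems = {
    "dual-channel DDR5-6000": 96e9,    # ~96 GB/s
    "Apple M-series Max":     400e9,   # ~400 GB/s unified memory
    "discrete GPU (GDDR6X)":  1000e9,  # ~1 TB/s, but limited VRAM capacity
}

for name, bw in systems.items():
    print(f"{name:24s} ~{bw / MODEL_BYTES:5.1f} tokens/s ceiling")
```

The ceilings (roughly 3 vs. 11 vs. 29 tokens/s here) show why a unified-memory board with both large capacity and high bandwidth is the interesting combination: discrete GPUs have the bandwidth but not the capacity, and desktop DDR5 has the capacity but not the bandwidth.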
3/5/2026 at 3:10:49 PM
Just like the previous generation of AI PCs, consumers just need a USB/PCIe NPU. Mass adoption won't happen until we get those cheap, because there are no mass-market prosumers making massively popular software for them.
by downrightmike
3/5/2026 at 3:42:50 PM
No, AI inference is mainly constrained by RAM capacity and speed; we need more fast RAM to make local AI thrive.
by u8080
3/5/2026 at 4:44:19 PM
Lol. Thanks to someone buying all the RAM wafers before they became modules, that won't happen.
by downrightmike