alt.hn

4/1/2025 at 8:35:15 PM

The Nvidia DGX Spark Is a Tiny 128GB AI Mini PC Made for Scale-Out Clustering

https://www.servethehome.com/the-nvidia-dgx-spark-is-a-tiny-128gb-ai-mini-pc-made-for-scale-out-clustering-arm/

by PaulHoule

4/1/2025 at 10:21:51 PM

The upcoming wave of APU-like mini PCs will be really cool in general.

The memory throughput looks a tad on the low side, but combined with MoE-style models it will still allow big models to run at reasonable speeds.
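The reason MoE helps here: at batch size 1, decode speed is roughly bounded by how fast the weights each token touches can be streamed from memory, and an MoE model only touches its active experts per token. A back-of-envelope sketch with illustrative sizes (the 100 GB / 12 GB split is an assumption, not a specific model):

```python
def decode_tokens_per_sec(bandwidth_gb_s, bytes_read_per_token_gb):
    # Rough upper bound: decode is limited by streaming the weights
    # each token touches through memory once (batch size 1).
    return bandwidth_gb_s / bytes_read_per_token_gb

bandwidth = 273  # GB/s, the APU-class figure from the article

# Dense model: every token reads all weights (hypothetical ~100 GB).
dense = decode_tokens_per_sec(bandwidth, 100)

# MoE model of the same total size, but only ~12 GB of active
# expert weights per token (illustrative numbers).
moe = decode_tokens_per_sec(bandwidth, 12)

print(round(dense, 1), round(moe, 1))  # the MoE is ~8x faster to decode
```

The large total size still has to fit in memory, which is where 128 GB of unified memory earns its keep.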

Prices will need to drop, though. A grand is likely closer to most people's budgets than three for an AI quasi-toy.

by Havoc

4/1/2025 at 11:45:20 PM

The original Apple I computer was released in 1976 and sold for $666.66, which is $3,725.38 in Feb 2025 adjusting for inflation.

https://data.bls.gov/cgi-bin/cpicalc.pl?cost1=666.66&year1=1...
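The BLS calculator linked above is just scaling by the ratio of consumer price indexes. A minimal sketch, using approximate CPI-U values that are my assumption rather than figures from the comment:

```python
def inflation_adjust(amount, cpi_then, cpi_now):
    # Scale a historical price by the ratio of consumer price indexes.
    return amount * cpi_now / cpi_then

# Approximate CPI-U values (assumed): 1976 annual average ~56.9,
# Feb 2025 ~319.1; the exact result depends on the precise BLS series.
print(round(inflation_adjust(666.66, 56.9, 319.1), 2))
```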

by fragmede

4/2/2025 at 11:08:45 AM

Good point. I do think expectations have shifted on electronics, though. Look at big TVs, for example: those went from a really big purchase to basically just an accessory.

by Havoc

4/2/2025 at 12:23:02 AM

Are these just good for LLM inference, or can they be used to train stuff like CV models too? (Let's say vs. a 5090, which is in the same ballpark price-wise.)

by captaindiego

4/2/2025 at 5:31:12 AM

From my experience, LLM inference really, really likes memory bandwidth; at 1.79 TB/s, the 5090 has quite the lead over the APU's 273 GB/s.
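For a rough sense of what that bandwidth gap means for single-stream decode speed (a sketch with an illustrative model size, not a benchmark):

```python
# Back-of-envelope decode speed: each generated token streams the
# model's weights through memory once, so tokens/s <= bandwidth / size.
def max_tokens_per_sec(bandwidth_gb_s, model_gb):
    return bandwidth_gb_s / model_gb

model_gb = 20  # hypothetical ~32B dense model quantized to ~4 bits

apu = max_tokens_per_sec(273, model_gb)   # DGX Spark class memory
gpu = max_tokens_per_sec(1790, model_gb)  # 5090's ~1.79 TB/s

print(round(apu, 1), round(gpu, 1), round(gpu / apu, 1))
```

The flip side is capacity: a model much bigger than 32 GB won't fit on the 5090 at all, while the APU's 128 GB can hold it, just slowly.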

by banderwidthdk