4/1/2025 at 8:26:31 AM
Just out of curiosity, I wish online LLMs would show real-time power usage and actual dollar costs as you interact with them. It would be so insightful to understand to what degree the technology is subsidized and what the actual value/cost ratio is. I've read somewhere that generating a single AI image draws as much power as a full smartphone charge.
If the suspicion is true that costs are too high to be monetized, then the current scale-up phase is going to be interesting. Right now, people only occasionally have a chat with AI. That's quite a different scenario from having it integrated across every stack and constantly running in the background, for billions of people.
Late as they may be, for the consumer space I think Apple is clever to push as much as possible to the local device.
by iteratethis
4/1/2025 at 8:57:33 AM
Also out of curiosity, I did some quick math regarding that claim you read somewhere. Cellphone battery charge: I have a 5000 mAh cellphone battery. If we ignore charging losses (pretty low normally, but not sure at 67 W fast charging), that battery stores about 18.5 watt-hours of energy at a nominal 3.7 V, or about 67 kilojoules.
Generating a single image at 1024x1024 resolution with Stable Diffusion on my PC takes somewhere under a minute at a maximum power draw under 500 W. Let's cap that at 500 W × 60 s = 30 kilojoules.
So it seems plausible that for cellphones with smaller batteries, and/or with intensive image generation settings, there could be overlap! For typical cases, I think you could get a few (low single digits) AI-generated images for the power cost of a cellphone charge, maybe a bit better at scale.
So in other words, maybe "technically incorrect" but not a bad approximation to communicate power use in terms most people would understand. I've heard worse!
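If anyone wants to redo the arithmetic, here it is as a quick Python sketch (the 3.7 V nominal cell voltage is my assumption; the other figures are from above):

```python
# Back-of-envelope: one full phone charge vs. one Stable Diffusion image.
# Assumptions: 5000 mAh battery at a nominal 3.7 V, and a worst-case
# 500 W draw for 60 s of image generation on a home PC.
battery_wh = 5.0 * 3.7          # 5000 mAh * 3.7 V = 18.5 Wh
battery_kj = battery_wh * 3.6   # 1 Wh = 3.6 kJ -> ~66.6 kJ

image_kj = 500 * 60 / 1000      # 500 W * 60 s = 30 kJ

print(f"Battery: {battery_kj:.1f} kJ, image: {image_kj:.0f} kJ")
print(f"Images per charge: {battery_kj / image_kj:.1f}")  # ~2.2
```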
by Saigonautica
4/1/2025 at 9:59:26 AM
Your home setup is much less efficient than production inference in a data center. The open-source implementation of SDXL-Lightning runs at 12 images a second on a TPU v5e-8, which uses ~2 kW at full load. That's ~170 J per image, or about 1/400th of the phone charge.
https://cloud.google.com/blog/products/compute/accelerating-...
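A quick sanity check on those numbers in Python (the 2 kW and 12 images/s are the quoted figures; the 67 kJ phone charge is from the comment above):

```python
# Data-center inference: SDXL-Lightning on a TPU v5e-8 (quoted figures).
tpu_watts = 2000                                # ~2 kW at full load
images_per_sec = 12
joules_per_image = tpu_watts / images_per_sec   # ~167 J per image

phone_charge_j = 67_000                         # ~67 kJ per full charge
ratio = phone_charge_j / joules_per_image
print(f"{joules_per_image:.0f} J/image, ~1/{ratio:.0f}th of a phone charge")
```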
by grandmczeb
4/1/2025 at 8:46:18 PM
These models do not appear from thin air. Add in the training cost in terms of power. Yes, it's capex and not opex, but it's not free by any means. Plus, not all these models run on optimized TPUs; most run on NVIDIA cards, which are nowhere near that efficient.
Otherwise I could argue that running these models is essentially free, since my camera can do face recognition and tracking at 30 fps without a noticeable power draw, because it uses a dedicated, purpose-built DSP for that stuff.
by bayindirh
4/1/2025 at 9:02:50 PM
GPU efficiency numbers in a real production environment are similar.
by grandmczeb
4/1/2025 at 9:08:47 PM
I doubt it, but I can check the numbers when I return to the office ;)
by bayindirh
4/2/2025 at 11:58:45 AM
Oh, that's way better! I guess the comparison only holds as approximately true with home setups -- thanks for the references.
by Saigonautica
4/1/2025 at 9:31:10 AM
My PC with a 3060 draws 200 W when generating an image, and it takes under 30 seconds at that resolution; in some configurations (LCM), way under 10 seconds. That's a low-end GPU. High-end GPUs can generate at interactive frame rates. You can generate a lot of images with the energy you would otherwise use to play a game for two hours: generating an image for 30 seconds uses the same amount of energy as playing a game on the same GPU for 30 seconds.
by elpocko
4/1/2025 at 11:17:15 AM
One point missing from this comparison is that cell phones just don't take all that much electricity to begin with. A very rough calculation is that it takes around 0.2 cents to fully charge a cell phone. You spend maybe around $1 PER YEAR on cell phone charging per phone. Cell phones are just confusingly not energy intensive.
by gdhkgdhkvff
4/1/2025 at 11:55:06 AM
And for reference, it takes around $10/year to run a single efficient indoor LED lightbulb. So a year's worth of cell phone charging costs less than 1/10th of running an efficient LED bulb for the full year. Again, cell phones are just confusingly not energy intensive.
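Rough numbers behind that, as a Python sketch (my assumptions, not exact figures: ~15 Wh per full charge, one charge a day, $0.16/kWh, and a 9 W LED bulb running ~20 h/day, i.e. on most of the time):

```python
# Rough yearly cost: charging a phone daily vs. running an LED bulb.
rate = 0.16                               # $/kWh, US residential average
phone_year = 15 / 1000 * 365 * rate       # 15 Wh/day -> ~$0.88/yr
led_year = 9 / 1000 * 20 * 365 * rate     # 9 W for 20 h/day -> ~$10.51/yr

print(f"Phone: ${phone_year:.2f}/yr, LED: ${led_year:.2f}/yr")
print(f"The bulb costs ~{led_year / phone_year:.0f}x as much")
```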
by gdhkgdhkvff
4/1/2025 at 10:49:15 AM
How about if you cap the power of the GPU? Modern semiconductors have non-linear performance/efficiency curves. It's often possible to get big energy savings with only a small loss in performance.
by mrob
4/1/2025 at 8:23:58 PM
> Generating a single image at 1024x1024 resolution
That's not a very big image, though. Maybe if this were 25 years ago.
You should at least be generating 1920x1080, pretend you're making desktop backgrounds from 10 years ago
by bluefirebrand
4/1/2025 at 9:14:14 AM
> Generating a single image at 1024x1024 resolution with Stable Diffusion on my PC takes somewhere under a minute at a maximum power draw under 500W
That's insane, holy shit. That's not even a very large image.
Apparently my estimates of how power-hungry GPUs are these days were off by an order of magnitude.
by facile3232
4/1/2025 at 11:57:42 AM
Why is that "insane"? Drawing the same image in Photoshop, or modeling and rendering it, at the same quality and resolution on the same computer, would require much more time and energy.
by elpocko
4/1/2025 at 12:14:27 PM
> Drawing the same image in Photoshop, or modeling and rendering it, in the same quality and resolution on the same computer, would require much more time and energy.
Right, but there was a point at which we could stop people from doing stupid shit because it's useless and they're bad with money. Now it seems we've embraced irrational and misanthropic spending as a core service.
We honestly just need to take money away from people who obviously have no clue what to do with it. Using AI seems like a perfect signal for people who have lost touch with an understanding of value.
by facile3232
4/1/2025 at 9:29:18 AM
4090s literally melted their power connectors...
https://videocardz.com/newz/nvidia-claims-melting-connector-...
by baq
4/1/2025 at 9:17:35 AM
A 1024x1024 image seems like an unrealistically small image size in this day and age. That's closer to an icon than a useful image size for display purposes.
by nkrisc
4/1/2025 at 9:19:52 AM
I think you're being hyperbolic. On a 1080p screen that's almost the entire vertical real estate. You'd upscale it if you're going to actually use this thing for "useful purposes" like marketing material, but that's not an icon.
by aqme28
4/1/2025 at 10:49:38 AM
A bit, I do admit. But given the ubiquity of 2k+ screens I don't think it's entirely hyperbolic. Closer to an icon in size, I meant, not necessarily usage.
by nkrisc
4/1/2025 at 11:46:05 AM
They're not nearly as common as you think. 1920x1080 is still, by far, the dominant desktop and laptop resolution in 2025.
by esseph
4/1/2025 at 1:00:46 PM
> I've read somewhere that generating a single AI image draws as much power as a full smartphone charge.
To put that in perspective: using the 67 kJ of energy for a smartphone charge given in Saigonautica's comment, you can charge a smartphone 336 times for $1 if you are paying the average US residential electricity rate of just under $0.16/kWh.
You could charge a smartphone 128 times for $1 if you were in the state with the most expensive electricity (Hawaii) and paying the average rate there of around $0.42.
Saigonautica's battery is on the large side. It's a little bigger than the battery of an iPhone 16 Pro Max. A plain iPhone 16 could be charged 470 times for $1 at average US residential electricity prices.
For most people energy used to charge a smartphone is in the "this is too small to ever care about" category.
We can do a similar calculation for AA rechargeable batteries, and the results might be surprising.
$1 of electricity at the US average residential rate is enough to recharge an AA Eneloop nearly 2300 times. Of course there are inefficiencies in the charger and charging, but if we can get even 75% efficiency that's good enough for more than 1700 charges.
That really surprised me when I first learned it. I knew it wasn't going to be a lot...but 1700 charges is I think more than the number of times I'll swap out an AA battery over my entire lifetime. I hadn't expected that all my AA battery use for my whole life would be less than $1 worth of electricity.
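The arithmetic for all of these is the same, so here it is as a small Python helper (the battery capacities are my nominal assumptions: ~18.6 Wh for the big phone, ~2.4 Wh for a 2000 mAh × 1.2 V Eneloop, so the AA figure lands a little below the one above):

```python
# Full charges you get per $1 of electricity at a given residential rate.
def charges_per_dollar(capacity_wh, rate_per_kwh, efficiency=1.0):
    """capacity_wh: energy per full charge; efficiency: charger losses."""
    cost_per_charge = (capacity_wh / 1000) / efficiency * rate_per_kwh
    return 1 / cost_per_charge

print(round(charges_per_dollar(18.6, 0.16)))       # big phone, US avg: ~336
print(round(charges_per_dollar(18.6, 0.42)))       # Hawaii rate: ~128
print(round(charges_per_dollar(2.4, 0.16, 0.75)))  # Eneloop AA at 75%: ~1950
```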
by tzs
4/1/2025 at 9:41:41 AM
> It would be so insightful to understand to which degree the technology is subsidized and what the actual value/cost ratio is.
It would be insightful for competitors too, because they could use this as part of their analysis and pricing strategy against you.
Therefore, no company would possibly allow such data to be revealed.
And in any case, if these LLM providers burn cash to provide a service to you, then you ought to take maximal advantage of it. Just like how Uber subsidized rides.
by chii
4/1/2025 at 8:44:26 AM
Feel like if they did this, the whole AI bubble would pop.
by polytely
4/1/2025 at 9:23:06 AM
It's not just Apple integrating AI into the hardware. Microsoft has been part of a big push to "AI PCs" with certain minimum capabilities (and I'm sure their partners don't mind selling new gear) and the Copilot button on keyboards, and certain Android models have the processors and memory capacities specifically for running AI.
by keyringlight
4/1/2025 at 12:45:00 PM
> It would be so insightful to understand to which degree the technology is subsidized and what the actual value/cost ratio is.
For whom would this be beneficial? The design goals of these products are to get as many users as fast as possible, using it for as long as possible. "Don't make me think" is the #1 UX principle at work here. You wouldn't expect a gas pump terminal to tut-tut about your carbon emissions.
by rchaud
4/1/2025 at 10:18:23 AM
How much energy does it cost for a human to generate an image?
by kosh2
4/1/2025 at 10:35:29 AM
You mean, how much extra energy compared to what the human was going to do instead? It might be a negative amount. But that might be a bad thing: an artist could get fat.
by card_zero