
It can cost a lot to run a GPU, especially at full load. A stock 4090 pulls 500 watts under full load[0], which, running around the clock, is 12 kWh/day or about 4380 kWh a year, or over $450 a year at an average residential rate of $0.10-$0.11/kWh. The only variable is whether training draws the same power as hitting it with FurMark.

0: https://youtu.be/j9vC9NBL8zo?t=983
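A minimal Python sketch of that arithmetic, assuming the GPU runs at full load 24/7 and using the 500 W draw and a $0.105/kWh rate from the comment above:

  watts = 500                       # stock 4090 full-load draw (from the linked video)
  kwh_per_day = watts * 24 / 1000   # 12 kWh/day
  kwh_per_year = kwh_per_day * 365  # 4380 kWh/year
  rate = 0.105                      # $/kWh, midpoint of the $0.10-$0.11 residential average
  print(f"{kwh_per_day:.0f} kWh/day, {kwh_per_year:.0f} kWh/year, ${kwh_per_year * rate:.0f}/year")
  # -> 12 kWh/day, 4380 kWh/year, $460/year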



> $0.10-$0.11/kWh for average residential rates

you Americans don't know how good you have it...


That’s a cheap rate for sure. Southern California is $0.36/$0.59/$0.74 peak. Super expensive.


Only Cali and the northeastern-most states seem to have rates this high. Every other continental state is under $0.14/kWh: https://www.eia.gov/electricity/state/
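For scale, a rough sketch of what the same 4380 kWh/year costs at a few of the rates mentioned in this thread (real time-of-use billing is more complicated than a flat multiply):

  kwh_per_year = 4380  # from the 500 W, 24/7 figure in the top comment
  for rate in (0.10, 0.14, 0.36, 0.74):
      print(f"${rate:.2f}/kWh -> ${kwh_per_year * rate:,.0f}/year")
  # $0.10/kWh -> $438/year
  # $0.14/kWh -> $613/year
  # $0.36/kWh -> $1,577/year
  # $0.74/kWh -> $3,241/year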


Southern California? Time to buy some solar panels!


Imagine someone paid you 25c/hour for 4090 compute sharing.


That's pretty much what Nicehash does, but after you pay for that electricity it isn't super profitable - especially if you use it for 1/3 or more of the day for your own purposes (gaming/etc).
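A rough sketch of that trade-off, assuming the hypothetical 25c/hour payout from the comment above, the 500 W draw and ~$0.105/kWh rate from the top comment, and a third of the day kept for your own use (NiceHash's actual payouts vary with mining conditions):

  payout_per_hour = 0.25           # hypothetical compute-sharing rate from the parent comment
  watts = 500                      # full-load draw, assumed from the top comment
  rate = 0.105                     # $/kWh, assumed residential rate
  hours_rented = 24 * (1 - 1/3)    # a third of the day reserved for gaming/etc
  revenue = payout_per_hour * hours_rented
  electricity = watts / 1000 * hours_rented * rate
  print(f"${revenue:.2f}/day revenue, ${electricity:.2f}/day electricity, "
        f"${(revenue - electricity) * 365:.0f}/year net")
  # -> $4.00/day revenue, $0.84/day electricity, $1153/year net

At lower real-world payouts and higher electricity rates the margin shrinks quickly.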



