
This is where the lack of imagination comes in (not you in particular, this is everyone right now). I'm postulating something non-obvious and pretty contentious, but I think compute should always be more valuable than the cost of the power it consumes.


That's an obviously wrong statement: at some level of power consumption, you've burned the earth to ashes, and there are no returns on compute that could pay that off.


This is where the economics of efficiency come in. You can call it a lack of imagination and all that, but the graveyard of bitcoin miners would disagree. Yes, we might come up with some novel use for computation whose value somehow exceeds the power cost, but that's not an easy problem even at very low margins. CPU compute is cheap and scales linearly. LLMs are one of the few recent workloads that produce value for all that computation, outside the normal workloads we've always been running (e.g. serving sites, etc.).


I can't see how that edge case actually changes the practical value of the original statement, though.


If you spend 1000W to get 1 GFLOPS when you could have gotten 1 TFLOPS from newer hardware for the same power, you are mostly just throwing money away. Unless fab capacity is severely limited in the future or energy becomes too cheap to meter, the opportunity cost is just too great for your statement to be true.
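A quick sketch of that opportunity cost, assuming a hypothetical $0.10/kWh electricity price (not a figure from the thread): both machines burn the same energy bill, but the newer one delivers 1000x the compute per dollar of power.

```python
# Opportunity cost of running old hardware at the same power draw.
# Assumed electricity price of $0.10/kWh is hypothetical.
price_per_kwh = 0.10
watts = 1000.0
hours_per_year = 24 * 365

# Yearly energy cost is identical for both machines at 1000W.
yearly_cost = (watts / 1000.0) * hours_per_year * price_per_kwh  # dollars/year

old_gflops, new_gflops = 1.0, 1000.0  # 1 GFLOPS vs 1 TFLOPS
cost_per_gflop_old = yearly_cost / old_gflops
cost_per_gflop_new = yearly_cost / new_gflops

print(f"Yearly power bill (either machine): ${yearly_cost:,.0f}")
print(f"Old hardware: ${cost_per_gflop_old:,.2f} per GFLOPS-year")
print(f"New hardware: ${cost_per_gflop_new:,.2f} per GFLOPS-year")
print(f"Opportunity-cost ratio: {cost_per_gflop_old / cost_per_gflop_new:,.0f}x")
```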


Even if fab capacity becomes unconstrained, you'd have to buy newer GPUs to take advantage of the more efficient processes.


Consider the extrema: You have a Pentium P5 running at 16W doing something like 70 MIPS (which isn't quite producing 70 MFLOPS), getting something like 4 MFLOP/Watt.

Then take a Titan V at 14.9 TFLOPS (FP32) at about 250W, for roughly 59,600 MFLOP/Watt.

There's almost no conceivable world in which it's worth running the P5. It literally consumes ten thousand times as much power per unit compute as a not-quite modern GPU.
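The comparison above can be checked directly from the quoted figures (treating the P5's 70 MIPS optimistically as 70 MFLOPS):

```python
# Rough perf-per-watt comparison using the figures quoted above.
p5_mflops, p5_watts = 70.0, 16.0          # Pentium P5: ~70 MIPS, treated as MFLOPS
titan_tflops, titan_watts = 14.9, 250.0   # Titan V: FP32 peak, ~250W

p5_eff = p5_mflops / p5_watts                    # MFLOP per watt
titan_eff = (titan_tflops * 1e6) / titan_watts   # TFLOPS -> MFLOPS, per watt

print(f"P5:      {p5_eff:,.1f} MFLOP/W")
print(f"Titan V: {titan_eff:,.0f} MFLOP/W")
print(f"Ratio:   {titan_eff / p5_eff:,.0f}x")    # on the order of ten thousand
```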

For compute to "always" be more valuable than the cost of the power it consumes, the value of that compute would have to be infinite. We have no such application. And I suspect we're unlikely to. :-)


Computations will usually not be done if the result is less valuable than the cost. This is true of everything, not just compute.



