I find it weird that, on the whole, there wasn't a comparably huge pushback against AWS, Google, and Facebook (which run by far the biggest fleets of data centers) over power and carbon emissions during the last couple of decades. Maybe they just managed the propaganda better, touting purchases of future renewable capacity and how low their PUEs are.

Turning 4% of the U.S.'s electricity into cat videos, online shopping, advertisements, and heat already sounds like a lot. Maybe it's the rapid rise of AI that is alarming people?

Contrast that with the 2016 study[0] of data center energy use: consumption had recently gone flat because of efficiency improvements in the 2010-2020 period, but historically it had grown enormously since ~1990. Basically, we have always been on a locally exponential growth curve in data center energy use; the hyperscalers were just optimizing the constant factors during that 2010-2020 window.
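To make that concrete, here's a toy back-of-envelope (the growth and efficiency numbers are made-up assumptions for illustration, not figures from the report): if compute demand grows ~25%/year while energy per unit of compute falls at the same rate, total energy stays flat even though the underlying demand curve is exponential.

    # Toy illustration -- numbers are assumptions, not from the LBNL report.
    # Exponential growth in compute demand, cancelled by matching efficiency
    # gains, yields roughly flat total data center energy use.
    compute_demand = 1.0       # arbitrary units of compute served
    energy_per_compute = 1.0   # arbitrary energy per unit of compute
    for year in range(2010, 2021):
        total_energy = compute_demand * energy_per_compute
        print(f"{year}: demand {compute_demand:5.1f}x, energy {total_energy:.2f}x")
        compute_demand *= 1.25        # assumed ~25%/yr growth in compute demand
        energy_per_compute /= 1.25    # assumed matching efficiency improvement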

We also need to compare the efficiency of AI with other modes of computation/work. The article goes into detail on the supposed actual energy use, but there's a simpler metric: all of the large companies publish prices per unit of inference, which puts a hard ceiling on the underlying energy cost, since a provider can't keep selling inference for less than its electricity bill. Prices are something like $20/1M tokens for the best models, and METR used a 2M-token budget, so you can currently price out N hours of work at roughly $40 from whichever METR benchmarks come out next and treat that as a worst-case cost for the efficiency comparison.
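As a sketch of that ceiling (the $20/1M-token price and the $0.10/kWh electricity rate here are assumptions for illustration, not figures from the article): the revenue a provider collects for a run, divided by its electricity price, bounds the kWh that run could possibly have consumed, since hardware, staff, and margin all come out of the same revenue.

    # Back-of-envelope energy ceiling for a block of inference.
    # Assumptions (illustrative, not from the article): $20 per 1M tokens,
    # $0.10 per kWh. A provider can't sustainably spend more on electricity
    # than it charges, so revenue / electricity price bounds the energy.
    PRICE_USD_PER_MILLION_TOKENS = 20.0
    ELECTRICITY_USD_PER_KWH = 0.10

    def worst_case_kwh(tokens: int) -> float:
        """Upper bound on kWh behind `tokens` of paid-for inference."""
        revenue = tokens / 1_000_000 * PRICE_USD_PER_MILLION_TOKENS
        return revenue / ELECTRICITY_USD_PER_KWH

    # METR-style 2M-token budget: $40 of inference is at most ~400 kWh,
    # and in practice far less, since the price also covers hardware and margin.
    print(f"{worst_case_kwh(2_000_000):.0f} kWh ceiling for a 2M-token run")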

Lastly, if we're not on a trend toward Dyson swarms of compute in the long run, then what are we even doing as a species? Of course energy spent on compute is going to grow quadratically into the distant future as we expand. People are complaining about compute for AI, but compute is how we figure things out and get things done; AI is just the latest tool.

[0] https://eta.lbl.gov/publications/united-states-data-center-e...
