I'm dead serious about this. I think we should weigh all large-scale compute against its carbon cost. Granted, FAANG is starting to look for renewable sources for its data centers, but US data centers alone will use around 73 billion kWh in 2020 [1].
How much of that is ML? As someone pointed out in another comment, how much ML is actually useful? Some ML is "carbon good", such as route planning that saves energy. But do we really need to spend billions of kWh just to get slightly better recommendations? Do we really need to increase margins by a fraction of a percent so some company can show more ads and sell more?
And while we're on the subject of power, maybe if web pages weren't 300 MB of cruft wrapped around 1 KB of content, we could cut back on another few billion kWh across servers and routers.
This shit is serious, we're dying here, so yes, absolutely, let's do the math about how much AI costs. It's up to us, the computer people, to ask these questions and solve the part of this problem that WE own.
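"Do the math" can start with a back-of-envelope sketch. Here's one in Python using the 73 billion kWh figure cited above; the average-household number is my own rough assumption (in the ballpark of US averages), not something from this thread:

```python
# Back-of-envelope: how big is 73 billion kWh, really?
US_DATACENTER_KWH = 73e9          # US data center usage cited above [1]
HOUSEHOLD_KWH_PER_YEAR = 10_700   # assumed average annual US household electricity use

households = US_DATACENTER_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly the annual electricity of ~{households / 1e6:.1f} million US households")
```

Even if the household figure is off by 20%, the conclusion doesn't change: data centers consume electricity on the scale of millions of homes, so shaving even a few percent off wasteful workloads is not a rounding error.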
It doesn't really matter whether it's powered by CO2-emitting sources or not. Even if it's all wind, that energy could have gone somewhere else. We are wasting energy, by a lot.
Fusion won't save us either. There is already evidence that every extra unit of energy we produce gets used: adding more green energy doesn't reduce the fossil fuel energy consumed, it just increases total consumption overall.
Don't waste energy.