The idea of holding individuals and small organizations responsible for their carbon use is a deliberate eco-fakery invented by the fossil fuel industry. It does nothing, except in cases where it leads to purchasing “carbon offsets,” in which case it does nothing and also makes some scammer somewhere some money.
Most big changes that need to happen are on the industrial level (switching to different sources of electrical power or changing pollution regulations). They may have some impact on the end-user consumer, but mostly not. Mostly what it would mean is that some obscenely rich person still gets to be obscenely rich but not as much as they want to be.
(AI and cryptocurrency are rare arguable exceptions where the power consumption is actually pretty significant and you can make a case that the individual involved bears some responsibility for the impact. But again, the strategy should be for the individual to advocate for changing regulations, not to look inward at themselves while turning a blind eye to everyone else who decides to murder the planet because there’s some money in it for them.)
All data centers put together account for about 2% of global electricity demand.
Cryptocurrency is almost a quarter of that.
AI is basically none of that right now, but it’s likely to rise to be competitive with cryptocurrency in the pretty near future as it sees wider and wider adoption.
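As a rough back-of-envelope check on those percentages, here’s a quick sketch. The ~25,000 TWh/year global electricity figure is an outside assumption brought in for illustration, not something from this thread; only the percentages come from the post above.

```python
# Back-of-envelope check on the data-center share figures above.
# The global electricity total is an assumed round number for illustration.

GLOBAL_ELECTRICITY_TWH = 25_000   # assumed annual global electricity demand
DATA_CENTER_SHARE = 0.02          # "about 2%" from the post above
CRYPTO_SHARE_OF_DC = 0.25         # "almost a quarter of that"

data_centers_twh = GLOBAL_ELECTRICITY_TWH * DATA_CENTER_SHARE
crypto_twh = data_centers_twh * CRYPTO_SHARE_OF_DC

print(f"All data centers: ~{data_centers_twh:.0f} TWh/year")  # ~500 TWh/year
print(f"Cryptocurrency:   ~{crypto_twh:.0f} TWh/year")        # ~125 TWh/year
```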
Crypto, yes; AI does not come even close to 1% of that.
Hm, I was a little bit wrong about it. You’re right, AI is basically nothing right now. I am, however, generally convinced by this analysis.
AI isn’t inherently more energy-demanding than any other program; most crypto is designed to be as inefficient as possible.
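To make the “designed to be as inefficient as possible” point concrete, here’s a minimal proof-of-work sketch (a simplified illustration, not any real coin’s implementation): the miner’s only job is to burn compute hashing until it stumbles on an output with enough leading zeros, and the network raises that difficulty as more compute joins.

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    The work is deliberately useless: the only way to find the nonce is
    brute force, and each extra digit of difficulty multiplies the wasted
    effort by roughly 16x.
    """
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

print(mine("example block", difficulty=4))
```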
It is, though. Most computer tasks that a company does on behalf of its customers can be done with a small handful of web servers, all the way up until you get to Google’s scale of operations or something. The reason is that the actual computation the computers are doing is measured in milliseconds on one share of a multicore CPU. AI requires dedicated computing hardware and runs for much longer than that, which means the investment in equipment, and how much of it you have to have, is orders of magnitude larger. And training the model often takes a whole cluster or data center if you’re going to be a serious AI company. You go from needing 10-20 computers even at Reddit’s scale or something, to needing hundreds or thousands.
You’re right that it’s not some sort of magic computation that’s harder or more expensive than other computation; it’s just that it’s unusual (until now) to build out a whole data center devoted to doing expensive pure computations on specialized hardware on behalf of your customers, and that’s gonna have an impact on how much power your operation consumes.
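Here’s a hedged sketch of that scaling argument, with made-up but plausible constants (every number below is an illustrative assumption, not a measurement): a web request costs milliseconds on a shared CPU core, while an AI inference request ties up a dedicated accelerator for whole seconds.

```python
import math

# All constants here are illustrative assumptions for the sake of the example.
REQUESTS_PER_SECOND = 2_000            # assumed peak traffic for the service

# Ordinary web request: a few milliseconds on one core of a shared CPU.
WEB_CPU_SECONDS_PER_REQUEST = 0.005
CORES_PER_WEB_SERVER = 32

# AI inference request: seconds of time on a dedicated accelerator.
AI_GPU_SECONDS_PER_REQUEST = 2.0
GPUS_PER_AI_SERVER = 8

web_servers = math.ceil(REQUESTS_PER_SECOND * WEB_CPU_SECONDS_PER_REQUEST / CORES_PER_WEB_SERVER)
ai_servers = math.ceil(REQUESTS_PER_SECOND * AI_GPU_SECONDS_PER_REQUEST / GPUS_PER_AI_SERVER)

print(f"Web servers needed: ~{web_servers}")   # a handful
print(f"AI servers needed:  ~{ai_servers}")    # hundreds
```

Even with generous headroom the web side stays at a handful of boxes, while the AI side needs hundreds of dedicated accelerator servers, and that hardware gap is where the power draw comes from.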