In analyzing both public and proprietary data about data centers as a whole, as well as the specific needs of AI, the researchers came to a clear conclusion. Data centers in the US used somewhere around 200 terawatt-hours of electricity in 2024, roughly what it takes to power Thailand for a year. AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million US homes for a year.
If we imagine the bulk of that was used for inference, it means enough electricity was used on AI in the US last year for every person on Earth to have exchanged more than 4,000 messages with chatbots. In reality, of course, average individual users aren’t responsible for all this power demand. Much of it is likely going toward startups and tech giants testing their models, power users exploring every new feature, and energy-heavy tasks like generating videos or avatars.
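As a sanity check, here is a quick back-of-envelope sketch in Python that reproduces both figures. The per-household consumption and world population values are outside ballpark assumptions (roughly the EIA and UN figures), not numbers from the report:

```python
# Back-of-envelope check of the figures above. The household consumption
# and world population are assumptions (EIA/UN ballparks); the 76 TWh
# high-end estimate comes from the article.

AI_ENERGY_WH = 76e12          # high-end 2024 estimate: 76 TWh, in watt-hours
HOME_KWH_PER_YEAR = 10_500    # assumed average US household usage (EIA ballpark)
WORLD_POPULATION = 8e9        # assumed ~8 billion people

homes_powered = AI_ENERGY_WH / (HOME_KWH_PER_YEAR * 1_000)
print(f"Homes powered for a year: {homes_powered / 1e6:.1f} million")  # ~7.2 million

# If all 76 TWh went to inference at ~4,000 messages per person,
# the implied average energy cost of a single chatbot message is:
messages_total = 4_000 * WORLD_POPULATION
wh_per_message = AI_ENERGY_WH / messages_total
print(f"Implied energy per message: {wh_per_message:.1f} Wh")  # ~2.4 Wh
```

The implied ~2.4 watt-hours per message is only an average; real queries vary widely with model size and output length.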
By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That is more than all the electricity currently used by US data centers for all purposes, and enough to power 22% of US households. It could generate the same emissions as driving over 300 billion miles, more than 1,600 round trips between Earth and the sun.
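The emissions comparison holds up under rough assumptions about grid carbon intensity and vehicle efficiency; neither figure below is from the report, both are EPA-ballpark assumptions:

```python
# Rough sanity check on the 2028 projection. Grid intensity and per-mile
# car emissions are assumed values (EPA ballparks), not report figures.

AI_2028_KWH = 326e9            # high-end projection: 326 TWh, in kWh
GRID_KG_CO2_PER_KWH = 0.4      # assumed US grid average
CAR_KG_CO2_PER_MILE = 0.4      # assumed average gasoline car
EARTH_SUN_MILES = 93e6         # mean Earth-sun distance

emissions_kg = AI_2028_KWH * GRID_KG_CO2_PER_KWH
equivalent_miles = emissions_kg / CAR_KG_CO2_PER_MILE
round_trips = equivalent_miles / (2 * EARTH_SUN_MILES)

print(f"Equivalent driving: {equivalent_miles / 1e9:.0f} billion miles")  # ~326 billion
print(f"Round trips to the sun: {round_trips:,.0f}")                      # ~1,750
```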
The researchers were clear that the adoption of AI, along with the accelerated server hardware that powers it, has been the primary force driving electricity demand from data centers to skyrocket after remaining flat for over a decade. Between 2024 and 2028, the share of US electricity going to data centers may triple, from its current 4.4% to 12%.
The environmental impact of AI extends beyond its electricity usage. Much of the electricity AI models consume is still generated from fossil fuels, making them a significant contributor to greenhouse gas emissions. The advanced cooling systems AI data centers require also consume large volumes of water, with serious consequences for regions already experiencing water scarcity.
The short lifespan of GPUs and other HPC components results in a growing problem of electronic waste, as obsolete or damaged hardware is frequently discarded. Manufacturing these components requires the extraction of rare earth minerals, a process that depletes natural resources and contributes to environmental degradation.
Additionally, the storage and transfer of massive datasets used in AI training require substantial energy, further increasing AI’s environmental burden. Without proper sustainability measures, the expansion of AI could accelerate ecological harm and worsen climate change.
All these estimates are hopefully nonsense. DeepSeek cost basically nothing to train, and AI uses basically no power when run locally, especially on NPUs.
It’s only the Altman AI camp pushing “let’s take what we have, not make it more efficient, and scale it infinitely! Now shut down the dangerous competition and give me cash.”
Can you explain the first sentence like I’m 5?
https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
https://iee.psu.edu/news/blog/why-ai-uses-so-much-energy-and-what-we-can-do-about-it
OK so tldr is “really bad”. God I didn’t know all of that
yeah :(