A commentary published in the journal Joule estimates that AI already accounts for about 20 percent of global data-center power demand. By the end of the year, that demand could nearly double, approaching half of all data-center electricity usage worldwide; these figures exclude the electricity used for bitcoin mining. The commentary's author, Alex de Vries-Gao, founder of Digiconomist, a research company that assesses the environmental impact of technology, argues that the growing adoption of energy-intensive models such as ChatGPT and other large language models makes it urgent to analyze AI's energy consumption.

According to de Vries-Gao, the money tech giants like Google and Microsoft have poured into AI far surpasses what was ever invested in bitcoin mining, and AI's electricity demand is on track to surpass bitcoin mining's by the end of the year. The rise in AI development is already affecting Big Tech's climate goals: Google has reported a 48 percent increase in greenhouse gas emissions since 2019, a surge in energy use that makes it harder for the company to reach its net-zero target by 2030. The International Energy Agency predicts that data centers' electricity consumption will more than double by the end of the decade, driven by massive expansions to accommodate new AI capacity.

However, there are still many unknowns about AI's exact share of data centers' current electricity consumption, since data centers also support many services that are far less energy-intensive than AI. Most attempts to quantify AI's energy usage have focused on the user side, estimating the electricity consumed by individual AI queries. De Vries-Gao took a different approach, analyzing the production side of AI hardware to obtain a more comprehensive view. AI's high computing demands have created a bottleneck in the global supply chain around companies such as the Taiwan Semiconductor Manufacturing Company (TSMC), a key producer of AI chips. Without increased production capacity, AI's electricity consumption could still reach 82 terawatt-hours this year, roughly equivalent to Switzerland's annual electricity consumption.
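To illustrate the logic of a production-side estimate, the sketch below converts an installed base of AI accelerators into annual terawatt-hours. This is a minimal back-of-envelope illustration, not de Vries-Gao's actual methodology; every input value is a hypothetical placeholder rather than a figure from the Joule commentary.

```python
def production_side_estimate_twh(units: int, kw_per_unit: float,
                                 utilization: float, hours: float = 8_760) -> float:
    """Convert an installed base of AI accelerators into annual electricity use (TWh)."""
    kwh = units * kw_per_unit * utilization * hours
    return kwh / 1e9  # 1 TWh = 1e9 kWh

# Hypothetical inputs, purely for illustration of the arithmetic:
estimate = production_side_estimate_twh(
    units=4_000_000,       # assumed accelerators in service
    kw_per_unit=1.0,       # assumed power draw per device, including overhead
    utilization=0.65,      # assumed average utilization
)
print(f"Illustrative AI electricity demand: {estimate:.0f} TWh/year")
```

The real analysis instead works backward from TSMC's packaging capacity and published accelerator specifications, which is what allows an estimate even when operators disclose little.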

Sasha Luccioni, an AI and energy researcher, emphasizes that disclosure from tech companies is essential to calculating AI's energy usage accurately. Although the study relies on publicly available information, significant unknowns remain. A previous Google paper on machine learning and electricity use indicated that machine learning accounted for 10 to 15 percent of the company's total energy use from 2019 to 2021, but the lack of detailed disclosures since that paper's release raises questions about the industry's transparency. De Vries-Gao stresses that tech companies need to provide more information on their energy consumption so that AI's impact on electricity demand can be assessed accurately.
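A disclosed share like Google's 10 to 15 percent figure only becomes an absolute number once it is combined with a company's total electricity use, which is exactly the kind of detail that is often missing. The snippet below shows that simple conversion; the total used here is an assumed placeholder, not a reported Google figure.

```python
# Turning a disclosed share into an absolute range.
# The 10-15 percent share comes from Google's earlier machine-learning paper;
# the company total below is a hypothetical placeholder for illustration only.

total_company_electricity_twh = 20.0        # assumed annual total (TWh)
ml_share_low, ml_share_high = 0.10, 0.15    # disclosed machine-learning share

low = total_company_electricity_twh * ml_share_low
high = total_company_electricity_twh * ml_share_high
print(f"Implied machine-learning electricity use: {low:.1f}-{high:.1f} TWh/year")
```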