AI Data Centers Could ‘Damage’ the Electricity Supply in US: Report
The growing demand for artificial intelligence (AI) has driven a surge in data center construction across the United States to provide the computational power these technologies require. However, the increased energy consumption of these AI-focused facilities is reportedly having a significant impact on the nation’s power grid, potentially putting millions of Americans at risk. A recent report by Bloomberg sheds light on a concerning correlation between the growth of data centers and deteriorating power quality.
The study, using data collected from over a million residential sensors tracked by Whisker Labs and additional market insights from DC Byte, revealed that more than 75% of the most severe power distortions in the U.S. occurred within 50 miles of significant data center operations. The report also found that over half of the households experiencing the worst power irregularities were situated within just 20 miles of major data centers.
These power distortions, also known as “bad harmonics,” can have serious consequences. Prolonged exposure to inconsistent power quality can damage electrical appliances, increase the likelihood of electrical fires, and contribute to blackouts and brownouts. Such risks are particularly pressing in areas where AI data centers are present due to their unpredictable and often higher energy consumption, which adds further strain to power grids already operating near capacity.
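For readers unfamiliar with the term, “bad harmonics” are voltage components at multiples of the 60 Hz grid fundamental, and a standard way to quantify them is total harmonic distortion (THD). The short Python sketch below is illustrative only and is not drawn from the report or from Whisker Labs’ methodology; the synthetic waveform, sensor sample rate, and harmonic amplitudes are assumptions chosen to show how the metric works.

```python
# Illustrative sketch: quantifying "bad harmonics" as total harmonic
# distortion (THD) of a sampled 60 Hz voltage waveform.
import numpy as np

FUND_HZ = 60          # US grid fundamental frequency
SAMPLE_RATE = 6000    # assumed sensor sampling rate (samples per second)
DURATION_S = 1.0

t = np.arange(0, DURATION_S, 1 / SAMPLE_RATE)

# Synthetic waveform: a 170 V-peak fundamental plus small 3rd and 5th
# harmonics, the kind of distortion rectifier-heavy loads can inject.
v = (170 * np.sin(2 * np.pi * FUND_HZ * t)
     + 8 * np.sin(2 * np.pi * 3 * FUND_HZ * t)
     + 5 * np.sin(2 * np.pi * 5 * FUND_HZ * t))

# FFT magnitudes at exact multiples of the fundamental.
spectrum = np.abs(np.fft.rfft(v)) / len(v) * 2
freqs = np.fft.rfftfreq(len(v), 1 / SAMPLE_RATE)
harmonic_amps = [spectrum[np.argmin(np.abs(freqs - n * FUND_HZ))]
                 for n in range(1, 11)]

# THD: RMS of harmonics 2..10 relative to the fundamental.
thd = np.sqrt(sum(a**2 for a in harmonic_amps[1:])) / harmonic_amps[0]
print(f"THD: {thd:.1%}")   # ~5.5% for this synthetic signal
```

Sustained THD at elevated levels is exactly the kind of “inconsistent power quality” the paragraph above describes: it heats motors and transformers, stresses appliance power supplies, and raises fire risk over time.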
AI data centers’ power consumption
According to a Deloitte report released in November 2024, AI data centers are forecast to use 90 terawatt-hours (TWh) of energy each year by 2026, a tenfold rise from 2022. Deloitte’s assessment suggests that continued advances in AI and data center processing could push energy consumption to approximately 1,000 TWh by 2030. On average, a single generative AI prompt consumes 10 to 100 times more electricity than a conventional internet search.
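To put those figures in context, the following back-of-the-envelope Python sketch works through the arithmetic implied above. Only the TWh totals and the 10x–100x multiplier come from the report; the 0.3 Wh per conventional web search is an assumed, commonly cited estimate used here purely for illustration.

```python
# Back-of-the-envelope check of the Deloitte figures quoted above.
ai_dc_2026_twh = 90                       # Deloitte forecast for AI data centers, 2026
implied_2022_twh = ai_dc_2026_twh / 10    # "tenfold rise from 2022" -> ~9 TWh
projected_2030_twh = 1000                 # Deloitte's ~2030 scenario

search_wh = 0.3                                       # assumed energy per web search
genai_prompt_wh = (search_wh * 10, search_wh * 100)   # 10x-100x per the report

print(f"Implied 2022 AI data center use: ~{implied_2022_twh:.0f} TWh")
print(f"2026 -> 2030 growth factor: ~{projected_2030_twh / ai_dc_2026_twh:.0f}x")
print(f"One GenAI prompt: roughly {genai_prompt_wh[0]:.0f}-{genai_prompt_wh[1]:.0f} Wh "
      f"vs ~{search_wh} Wh for a search")
```

Under those assumptions, the forecast implies roughly 9 TWh of AI data center use in 2022, another elevenfold jump between 2026 and 2030, and around 3–30 Wh per generative AI prompt versus a fraction of a watt-hour for a search.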
Experts are raising alarms about the growing energy demands of AI-driven data centers. Some estimates suggest that AI facilities could require up to five times the electricity of traditional data centers, significantly increasing the burden on local power infrastructure. As Bloomberg notes, these concerns have prompted pushback from some utilities, including Commonwealth Edison, a major provider in Chicago, which has questioned whether Whisker Labs’ data accurately reflects the complexities of the regions most affected by data center activity.
Growing demand for AI data centers
Meanwhile, as these energy-intensive AI centers continue to multiply, major tech companies are investing heavily in solutions to secure sustainable energy. Microsoft, which has poured billions into AI technologies, including its $13 billion stake in OpenAI, is reportedly working with investors such as BlackRock to fund a $100 billion energy infrastructure project. Other tech giants are following suit: Google, for example, has started exploring nuclear power as a potential energy source for its expanding AI operations.
Furthermore, companies such as Amazon and Google are heavily investing in new data center infrastructure. Google, in particular, announced a $3 billion investment to build and expand facilities in Virginia and Indiana, while Amazon pledged $10 billion to develop data centers in Ohio. With the growing strain on the power grid from these expanding data centers, there is a clear need for careful consideration of energy sustainability and grid stability. As AI continues to drive technological advancements, finding solutions that balance these developments with reliable, sustainable energy sources will be crucial for maintaining both progress and public safety.