How AI is Ruining the Electric Grid
- Artificial intelligence causes significant harmonic distortions, affecting the quality of electricity.
- The power supply near AI data centers experiences higher levels of noise and fluctuations that impact household electronics.
- AI data centers place enormous demands on the electric grid, straining resources and raising concerns about reliability.
- The shift towards more energy-intensive AI systems could push the grid to its limits, potentially leading to blackouts.
- Local communities are becoming increasingly cautious about the environmental impact and resource consumption of data centers.
Artificial intelligence is making your electricity worse. You might not have even known that was a possibility, that your power could be good or bad. We typically experience electricity in a binary sense: available or not. But it can be judged in a qualitative sense as well. And the electricity near AI data centers, well, it’s bad.
This is what your power is supposed to look like, at least in North America: a 60 Hz sine wave. AI, however, is causing it to look a bit more like this. These are harmonic distortions, deviations from perfection. Now, power is never perfect. There’s always some harmonic distortion. But there’s a threshold that is considered acceptable: 8%.
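To make that 8% figure concrete, here’s a minimal sketch of how total harmonic distortion (THD) is typically computed: the combined strength of the harmonics relative to the 60 Hz fundamental. The waveform and harmonic amplitudes below are synthetic, chosen only for illustration, not measurements from any real feeder.

```python
import numpy as np

# Synthesize one second of a 60 Hz "grid" waveform with some added
# harmonics (the 5th and 7th, at 300 and 420 Hz, are common culprits).
# The amplitudes are illustrative only, not measurements from a real feeder.
fs = 7680                                   # samples per second (128 per cycle)
t = np.arange(fs) / fs
fundamental = 1.00 * np.sin(2 * np.pi * 60 * t)
harmonics = 0.05 * np.sin(2 * np.pi * 300 * t) + 0.04 * np.sin(2 * np.pi * 420 * t)
signal = fundamental + harmonics

# Estimate per-harmonic amplitudes with an FFT (bin spacing is 1 Hz here),
# then compute THD: RMS of harmonics 2 through 50 over the fundamental.
spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(signal)
v1 = spectrum[60]
vh = spectrum[[60 * k for k in range(2, 51)]]
thd = np.sqrt(np.sum(vh ** 2)) / v1
print(f"THD = {thd:.1%}")                   # ~6.4% with these made-up amplitudes
```

With the made-up harmonic amplitudes above, the result lands around 6.4%, under the 8% limit; bump those amplitudes up a little and the same calculation crosses it.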
When a household is near a major data center, research has indicated that the likelihood that readings exceed that threshold goes up dramatically. This has implications. Your refrigerator, for example, includes a motor to operate its compressor. When the current is not smooth, the fluctuations in power lead to fluctuations in torque as that motor spins around its axis, meaning that in addition to the rotational force it was intended to create, it also creates oscillations as it spins.
As these oscillations interact with the motor casing and surrounding components, they create noise, a rattling sound. So that’s to say AI is making your refrigerator louder. And that noise is more than just annoying; it’s representative of mechanical stress that will lead that motor to die out sooner, spoiling your groceries and mandating a costly, cumbersome fix. The same applies to all the rest of your electronics.
The 8% harmonic distortion threshold is set where it is because that’s what normal household appliances can handle without shortening their lifespans. So incrementally and ever so slightly, AI is costing you money by making your electronics wear down faster. It’s no secret that artificial intelligence in its current form is tremendously energy intensive.
That undoubtedly matters. But demand is potentially the simpler problem to solve, especially when ignoring carbon goals. The issue is not just how much power AI uses; it’s how and where it uses it. Making AI a net positive for everyday Americans, in particular considering the tech sector’s concentration in the country, will be an enormous challenge, not just due to the potential implications for employment, creativity, disinformation, and more, but also due to the sheer strain on the electric grid.
This decade's most revolutionary technological breakthrough is housed within the nondescript walls of buildings like this spread across the US and the world. These are AI data centers. Now, data centers, the generic kind, are really nothing new. Take Google for instance. They opened this one in 2003, this one in 2007 and this one in 2009, just to name a few.
From harboring personal emails and documents saved to Google’s cloud to powering Google Maps or hosting third-party virtual machines and apps, it’s this network of brick-and-mortar data centers that has provided the physical infrastructure for Google to be a one-stop shop for all things cloud computing for decades now. And just as a simple Google search or Gmail response relies on these data centers, so too does an AI query.
The difference between a cloud computing data center and an AI data center lies in the chips, the processing units at the heart of them. In the middle of 2024, the title of the world’s most valuable public company belonged not to Microsoft or Apple, but to Nvidia, a company historically associated with video game graphics. From a $400 billion market cap to $3 trillion in a couple of years, the astronomical rise of Nvidia was largely attributable to what it’s able to supply for AI data centers: the H100 GPU chip.
GPUs, or graphics processing units, are what make AI possible. At the simplest level, these chips can work through all the tasks laid out by AI at the same time through parallel processing: thousands of cores, each effectively a tiny calculator, solving thousands of equations at the exact same time, something a central processing unit just can’t do at the same scale.
If a CPU, which is what you’ll typically find running the show within a data center, is an electric car capable of taking care of one process extremely quickly and efficiently, then a GPU is a semi-truck getting a whole lot of tasks done at once, but taking a good bit of gas to do it. Powering semi-trucks like Nvidia’s H100 takes a lot of energy. At peak power consumption, an H100 consumes 700 watts, not far off from a small household appliance like an air fryer.
But run at 61% utilization for an entire year, that consumption starts to creep up toward the consumption of a typical American household. Pushing demand higher is the fact that a data center needs tens of thousands of these GPUs to run even the simplest of large language models. Now the power required begins to look like that of a small city, and the demand for more energy is set to grow.
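As a rough back-of-the-envelope check, here’s a minimal sketch using the narration’s figures (a 700 W peak draw and 61% utilization). The 25,000-GPU cluster size is a hypothetical assumption, and how the result compares to a household or a small city depends on which consumption averages you use.

```python
# Back-of-the-envelope energy math using the figures from the narration:
# a 700 W peak draw and 61% average utilization. Cooling and other facility
# overhead are ignored here, and the cluster size is a made-up example.
PEAK_WATTS = 700
UTILIZATION = 0.61
HOURS_PER_YEAR = 24 * 365

kwh_per_gpu_year = PEAK_WATTS * UTILIZATION * HOURS_PER_YEAR / 1000
print(f"One H100: ~{kwh_per_gpu_year:,.0f} kWh per year")             # ~3,740 kWh

gpus_in_cluster = 25_000                    # hypothetical cluster size
cluster_gwh = kwh_per_gpu_year * gpus_in_cluster / 1_000_000
print(f"{gpus_in_cluster:,} GPUs: ~{cluster_gwh:,.0f} GWh per year")  # ~94 GWh
```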
While the far more powerful B100 and B200 chips use roughly the same amount of energy as the H100, AI models are asking more and more of the physical infrastructure. Take the total amount of compute required to train a current AI model. This is measured in floating point operations, or FLOP, or really in petaFLOP, where one petaFLOP is a quadrillion (10^15) floating point operations.
This is a graph of required compute per model. Note the y-axis that makes it possible to even plot this: the growth is exponential. GPT-1, for instance, required 18,000 petaFLOP, while GPT-4 required some 21 billion. When, or if, the demand or the technological advancements that make such data processing possible will begin to slow is difficult to tell.
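To put those two data points side by side, here’s a quick sketch of the arithmetic; the compute figures are the estimates quoted above, not official disclosures.

```python
import math

# Training compute figures quoted above, in petaFLOP (10^15 floating point
# operations each). Both are rough estimates, not official disclosures.
gpt1_pflop = 18_000
gpt4_pflop = 21_000_000_000

ratio = gpt4_pflop / gpt1_pflop
print(f"GPT-4 vs GPT-1 compute: ~{ratio:,.0f}x")              # ~1,166,667x
print(f"That is ~{math.log10(ratio):.1f} orders of magnitude")
print(f"GPT-4 total compute: ~{gpt4_pflop * 1e15:.1e} FLOP")  # ~2.1e+25 FLOP
```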
But in the meantime, as models consume orders of magnitude more data, the sheer power demand will continue to rise. And with it, the very core of large language models, the chips themselves, are only getting more and more capable. So while issues of latency on account of physical spacing, power supply, or even data availability represent real constraints and challenges, companies are racing one another to get around them, pushing the technology forward in the process.
But higher total demand for electricity is only part of what makes AI data centers difficult to deal with on the ground. Actually producing the power is only a component of what has experts concerned. The world was already in the midst of a transition towards renewable energy sources, meaning there’s a good bit of new generation capacity coming online that was intended to replace older, less efficient carbon-intensive production like coal power plants.
With that new capacity being absorbed by AI demand, keeping coal power plants online represents a quick and easy fix from the perspective of AI companies and grid operators, at least when ignoring the climate. But how much power AI consumes is far from the only challenge. How and where AI consumes power is equally tricky. Where typical data center operations optimize for steady, consistent energy usage, AI’s consumption is defined by moments of massive, intense demand as companies push for more robust and capable models that work faster and cheaper.
AI racks, composed of the physical hardware (the GPUs, cables, servers, and storage), are getting denser. To minimize latency and the use of expensive high-speed cables, designers are jamming more GPUs into each and every rack. This makes each rack more power-hungry overall, especially as these GPUs are only becoming more capable.
But it also raises the total peak consumption of each rack—a limit that is anything but theoretical. When it comes to the computing demands of artificial intelligence, AI operations can be split into two: there’s inference and there’s training.
Inference is what the consumer uses—the finished operational model that allows an individual to plug in their question to ChatGPT and get an answer. In inference, the computing demand and thus power demand is determined by user traffic. If people are hammering away at requests, it can strain the system and lead to a series of reflections on the true cost of saying please and thank you to ChatGPT. But generally, inference represents the steadier side of power demands.
Then there’s training: the process of feeding massive amounts of data to the models to prepare them for inference and user interaction. Training may only represent about 20% of AI’s total workload, but it is far and away the most intensive task of any AI data center. To train these large language models as quickly and efficiently as possible, it’s all hands on deck, with every GPU and all the supporting hardware working at maximum capacity.
This represents a major deviation from data center operations of the past, where demand and tasks were once spread out to smooth the peaks and troughs. The parallel work done by clusters of GPUs creates a synchronous demand that pushes each individual chip, along with the entire network, to peak power draw, right up against the edge of operational limits.
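A toy model of why that synchrony matters for peak draw: when every GPU hits its compute-heavy phase at the same moment, the facility’s peak equals the sum of the individual peaks, while staggered workloads average out. Every number below is invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_GPUS = 1_000             # hypothetical cluster size
IDLE_W, PEAK_W = 300, 700  # made-up per-GPU idle and peak draw
STEPS = 1_000              # time steps in the toy simulation
PERIOD = 100               # steps per compute/communicate cycle
DUTY = 0.6                 # fraction of each cycle spent at peak draw

def gpu_load(phase):
    """Square-wave load profile: PEAK_W during compute, IDLE_W otherwise."""
    t = np.arange(STEPS)
    return np.where((t + phase) % PERIOD < DUTY * PERIOD, PEAK_W, IDLE_W)

# Synchronous training: every GPU follows the exact same schedule.
sync_total = np.sum([gpu_load(0) for _ in range(N_GPUS)], axis=0)

# Staggered workloads: phases spread at random, more like classic cloud jobs.
stagger_total = np.sum([gpu_load(rng.integers(PERIOD)) for _ in range(N_GPUS)], axis=0)

print(f"Synchronous peak: {sync_total.max() / 1e6:.2f} MW")    # 0.70 MW, the sum of peaks
print(f"Staggered peak:   {stagger_total.max() / 1e6:.2f} MW")  # much closer to the ~0.54 MW average
```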
This rather unique synchronous demand from AI data centers has grid operators nervous. That’s because there is very, very little slack in the electric grid system. Demand must match supply and supply must match demand, as any imbalance between the two is what creates poor-quality power. It’s therefore an issue when demand drops off or ramps up suddenly and unexpectedly. And the more concentrated a power draw is, the higher the risk that its coming on or offline creates an issue.
And there have been warning shots. On July 10, 2024, on an unidentified transmission tower in Northern Virginia, a lightning arrestor failed. These are the devices designed to divert voltage caused by a lightning strike on a transmission tower away from the transmission wire, preventing a voltage spike in the grid itself.
In this case though, the lightning arrestor failure caused the surrounding section of line to fault and lock out. This issue manifested itself with a 42 millisecond depression in the grid’s voltage, followed by five subsequent dropouts over the following 82 seconds as the grid attempted to fix itself. Now that should have been the end of it. Keep that section of line deactivated, call out a repair crew, replace the lightning arrestor, and by the time the next day's peak afternoon demand came around, everything would have been fine again. That is, if not for Data Center Alley.
You see, Northern Virginia is home to the world’s largest concentration of Data Centers. The GPUs and servers and cooling systems within these facilities are highly sensitive to grid disruption. Even the slightest voltage spike or harmonic distortion can cause damage or misoperation. Therefore, just about every data center includes a fault detection system that, when an issue occurs, seamlessly switches the power supply over to a battery bank with a couple of minutes of capacity.
In the time that buys, a backup generator then fires up for longer-term supply. This has regulators worried. While the effect of one facility’s demand disappearing from the grid would be negligible, it’s different when a grid fault occurs in a major data center cluster like Northern Virginia. In this case, all of these data centers dropping off the grid simultaneously led to an almost instantaneous loss of 1,500 megawatts of load.
1,500 megawatts is enormous. That’s like if all of Iceland, Namibia, or Jamaica went offline. Grids are designed to adapt to demand. But they can only adapt so fast. When simplified, a grid works with two types of generation capacity: baseload and peak load. Conventional baseload generators are typically the most cost-effective facilities, the ones that operate more or less 24/7: nuclear, coal, hydroelectric, and certain designs of natural gas plants.
With each of these, startup and shutdown times are long. But that’s why there are peak load sources. They turn on and off in response to shifting demand throughout the day. The most common of these are certain other costlier, less efficient designs of gas power plants. Often these can start up in under five or so minutes, shut down even faster, and have the ability to throttle output.
But it’s never instant. Normally that’s not an issue as the aggregate demand of a whole region's power only increases or decreases so fast. But it is an issue when a whole Iceland's worth of data centers simultaneously shut off. When 1,500 megawatts of demand disappears, there’s only so much a grid operator can do.
They’ll shut down peaker plants as quickly as possible; they might have some demand response programs that get industrial users to pull power. If they have battery storage systems, they’ll set them into charging mode. But still, 1,500 megawatts is a ton. And in this July 2024 instance, the grid operator was incapable of responding fast enough.
Physics dictated the response. The power frequency rose from its intended 60 Hz up to 60.047. Now, a deviation of 0.047 Hz is not by itself a problem; it’s only a bit beyond what this grid operator considers standard. The reason this incident got regulators’ attention is because of what it demonstrates could happen in the future.
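For a sense of scale, here’s a minimal calculation that works backward from the two reported numbers, 1,500 megawatts of lost load and a rise to 60.047 Hz, to the frequency response the grid implicitly provided. Treating the relationship between an imbalance and the resulting deviation as linear is a standard simplification, and the larger losses at the end are purely hypothetical.

```python
# Working backward from the two figures reported for the July 2024 event:
# roughly 1,500 MW of load vanished and frequency rose from 60.000 to 60.047 Hz.
lost_load_mw = 1_500
delta_f_hz = 60.047 - 60.000

# Assume (as a simplification) that imbalance and frequency deviation scale
# linearly. The implied frequency response of the interconnection is then:
beta_mw_per_hz = lost_load_mw / delta_f_hz
print(f"Implied frequency response: ~{beta_mw_per_hz:,.0f} MW per Hz "
      f"(~{beta_mw_per_hz / 10:,.0f} MW per 0.1 Hz)")

# The same relationship shows why a larger, more concentrated loss is scarier.
for loss_mw in (3_000, 6_000, 12_000):      # hypothetical future losses
    print(f"{loss_mw:>6} MW lost -> ~{loss_mw / beta_mw_per_hz:.3f} Hz deviation")
```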
As AI data centers grow larger and move into more isolated areas, the proportion of demand that could disappear in an instant will grow larger, and so too will the consequences for the grid. A frequency deviation much greater than 0.047 Hz could start to trip protective systems across the grid, leading to further frequency deviations. And it’s these sorts of cascading sequences of problems that lead to major region-wide blackouts.
While not the full cause, cascading frequency issues such as this were a major component of the 2025 blackout in Spain and Portugal as well as other notorious blackouts like Texas in 2021 and the American Northeast in 2003. Now, issues such as frequency deviations and harmonic distortions are not new. They’ve long been potential consequences of large, power-hungry industrial sites.
And it’s for this very reason that the development of large, power-hungry sites such as AI data centers is gated by grid operators. A developer needs to apply to hook up a new site to the grid so that the grid operator can ensure there’s enough generation and transmission capacity. In the case of Loudoun County, home to Data Center Alley, there’s not, at least not yet.
That’s why there’s currently a four to seven-year backlog for connection requests as the grid operator races to upgrade its infrastructure. And there’s more; the voter base in the county is pushing back. Whereas in the past the county actively pursued data centers as a source of potential economic development, community opinions have soured due to grid and environmental impacts, the facilities’ unsightliness, and the relative lack of jobs.
The county has long had a policy allowing for near-automatic approval of new data centers. But earlier in 2025, in response to these downsides, the Board of Supervisors voted to reverse that. Now, new data center developments require their approval, likely leading to restrictions on future development. This is a story playing out across the country. Traditional data center clusters are getting congested.
Whereas in the past, it was appealing to build new facilities in these clusters close to users, clients, and talent, now as they strain resources, it’s growing more appealing to build a new data center as far away from other data centers as possible. Certain places view this as an opportunity. Places like Mesa, Arizona.
This is Elliot Road in 2014. It was a barren flat five-mile stretch of desert. But it was a barren flat five-mile stretch of desert sitting near two airports, paralleling a fiber optic cable and three high voltage transmission lines, along with a potential labor pool of more than 800,000 within a 30-minute drive.
This is Elliot Road today, now more commonly known as the Elliot Road Tech Corridor thanks to a rezoning process that categorized the neighborhood as light industrial. It allowed applicants (in fact, it actively encouraged, incentivized, and attracted them in the first place) to get their approvals fast-tracked to just six weeks.
One of the first to line up for these entitlements was, naturally, tech pioneer Apple, which announced a $2 billion investment into 1.3 million square feet of facilities. Other tech companies followed its lead, and now Elliot Road is a Silicon Valley outpost, yet only in representation. Here, strategy isn’t being set and products aren’t being designed. These are primarily data centers, and big ones, totaling millions of square feet of development.
That output is powered by a combination of factors. In this Arizona desert, land is cheap and the power is even cheaper, thanks to the state’s diversification of generation between nuclear, natural gas, hydropower, and renewables. Elliot Road, and more broadly, Phoenix, is uniquely positioning itself to meet the growing demand for data centers and the power they use.
It’s not only about sourcing the electricity, of which there is plenty in Arizona. It’s about the infrastructure to transport and store that power, all critical given AI’s volatile energy use patterns. That’s where companies like Phoenix’s Salt River Project, a utilities cooperative, are stepping up. Salt River Project is rising to meet demand, developing projects such as a 250 megawatt lithium-ion battery storage system on Elliot Road, as well as seven new miles of overhead 230 kilovolt power lines completed in 2024.
What’s happening on Elliot Road, though, is the version of this where a municipality can plan ahead, predict, and try to meet growing energy needs. And yet there is no perfect place for data centers: they need land for their massive size, they need power to make them run, and they need water to keep them cool. Across the United States, there are some 5,000 data centers, accounting for 4% of the country’s electricity use, according to the Department of Energy.
One center can theoretically use as much energy as 50,000 homes. And those statistics are essentially guaranteed to grow as the AI sector does. Data centers’ proportion of US electricity usage is expected to more than double by 2030. While right now Arizona is all for data center development, who’s to say how voters will feel when the next severe drought comes along?
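To put those statistics on the same scale as the megawatt figures from earlier, here’s a rough conversion sketch; the per-household average and the total US generation figure are approximate outside assumptions, not numbers from the narration.

```python
# Rough unit conversions to put those statistics on a common scale. The
# household average (~10,500 kWh/yr) and total US generation (~4,000 TWh/yr)
# are approximate outside assumptions, not figures from the narration.
HOMES = 50_000
KWH_PER_HOME_YEAR = 10_500
HOURS_PER_YEAR = 8_760

center_gwh_per_year = HOMES * KWH_PER_HOME_YEAR / 1e6
center_avg_mw = center_gwh_per_year * 1_000 / HOURS_PER_YEAR
print(f"A 50,000-home data center: ~{center_gwh_per_year:.0f} GWh/yr, "
      f"or ~{center_avg_mw:.0f} MW of average draw")        # ~525 GWh/yr, ~60 MW

US_GENERATION_TWH = 4_000
print(f"4% of US electricity: ~{0.04 * US_GENERATION_TWH:.0f} TWh per year")  # ~160 TWh
```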
And data centers are gobbling up water while residents live through water-use restrictions and rising food prices as farmers cut back. Increasingly, tech companies are coming to the conclusion that they can’t rely on sharing; they need to own their resources. Tech companies are prospecting all over the United States to find data center homes; it’s in their best interest to keep facilities stateside for security reasons.
But there are finite resources, even in the sprawl of America. In places like Pennsylvania, individual companies such as Microsoft are strategizing for the future by developing their own power sources instead of relying on municipalities. The tech company entered a 20-year partnership with Constellation to purchase 835 megawatts of generating capacity from the dormant Three Mile Island nuclear reactor.
There are actually two units at Three Mile Island, here and here. Constellation operated Unit 1 until 2019, when it closed down due to failing economics. But times have changed. Nuclear is a more expensive electric option, especially considering it will take a $1.6 billion investment just to bring the plant back online. But it also produces firm power, which doesn’t fluctuate the same way renewables do.
Microsoft is not alone. Amazon announced a partnership with Energy Northwest to develop small modular reactors. And Meta and Google have announced similar initiatives. The potential power demand of AI is just so great that the largest tech companies are bypassing the grid entirely.
But AI is not just Microsoft, Meta, and Google. It’s a nascent behemoth that will be made up of countless companies, both big and small. Meaning regardless of how much dedicated capacity the giants procure, plenty of data centers will still be linked to the grid. It’s largely inevitable at this point that AI will contribute to dramatically increased global power demand and therefore contribute to accelerated climate change.
It is, however, possible that some upside could come from this massive downside. That’s because the type of grid upgrades needed to accommodate AI’s energy demand are also the type needed to make the grid more compatible with renewable energy sources.
For example, the solution to the conventional grid’s lack of responsiveness to fast demand increases or decreases is massive battery banks like those being built in Mesa, Arizona. While peaker plants take minutes to fire up or shut down, battery banks can deliver all their output effectively in an instant, preventing the kinds of frequency deviations that could cascade into blackouts.
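As a toy comparison of that responsiveness gap, here’s a sketch of how much energy imbalance piles up while output ramps to cover a sudden 1,500 megawatt gap; the ramp times are illustrative assumptions, not performance specs for any real plant or battery.

```python
# Toy comparison: how much energy imbalance piles up while output ramps to
# cover a sudden 1,500 MW gap. Ramp times below are illustrative assumptions.
SHORTFALL_MW = 1_500

def imbalance_mwh(ramp_minutes):
    """Energy gap (MWh), assuming output ramps linearly from zero to the full
    shortfall over `ramp_minutes` and then holds (area of the triangle)."""
    return SHORTFALL_MW * (ramp_minutes / 60) / 2

for name, ramp_min in [("Gas peaker, ~5 min to start and ramp", 5),
                       ("Battery bank, ~1 s response", 1 / 60)]:
    print(f"{name}: ~{imbalance_mwh(ramp_min):.1f} MWh of imbalance")
```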
These battery banks are also what are needed to make solar and wind power more reliable—charging up when it’s windy and sunny and discharging when it’s not. The more battery storage capacity there is, the more renewables can act as baseload. The same can be said for transmission lines. Renewables are geographically dependent, so greater long-distance transmission capacity can transport power from windy or sunny places with ample open space to densely populated areas.
And they can also transport that power to data centers in the places where operators want them: near consumers, with ample water and labor resources. More transmission capacity means the concentration of data centers is less of an issue. There currently is not adequate incentive to transition the grid, at least fast enough, to one more compatible with renewables, because grid operators don’t pay for the cost of climate change. They do, however, pay the opportunity cost of not capturing the incoming demand from AI.
So the industry’s growth could act as the commercial capitalistic incentive that the US has failed to create to internalize the externalities of climate change. That’s far from a guarantee; it would still require smart policy-making, but it’s worth trying. As far as climate goals go, AI might take us two steps back, but it could also help us take one step forward.
And even if that’s a net negative, if the two steps back are inevitable, you might as well focus on how to take that one step forward.
Okay, so this is probably not your first time watching a Nebula ad read and it probably won’t be your last time—unless of course you sign up for Nebula. But I digress. To keep it interesting, I’m just going to give you three tips on what to watch on Nebula.
The first one is the kind of thing where I can only say so much without ruining the frankly incredible twist. So, I’m just going to show these reactions from Reddit and hope you trust them. These are the reactions to a new doc by Bobby Broccoli called 17 Pages About One of the Largest Scientific Controversies of the 20th Century. And trust me, the doc is excellent and mind-blowing.
I truly can’t say too much, but it’s definitely the kind of story Wendover viewers will enjoy, told excellently. Next, I’m going to suggest a new Nebula original video by Lindsay Ellis. You probably remember Lindsay Ellis, who was one of YouTube’s most prolific video essayists but stepped away from the platform and went Nebula-only.
I personally loved her recent release about when Disney rebrands attractions but does it terribly. It’s such a niche topic, but for some reason, I find it so entertaining. Lastly, I’m going to shout out a show that we at Wendover make called Abolish Everything—it’s a comedy debate show filmed in front of a live audience in New York City.
The reason I’m going to shout it out is because we just released what we all seem to agree is the best episode: it’s the sixth one. Just trust me, it’s great; if you want to give the show a shot, that’s the one to watch. Nebula’s filled with all sorts of great exclusive content made by digital creators.
I think it’s got the perfect blend of innovation that you don’t get from the major streamers with the polish and scale you don’t get from YouTube. And its cost is perfectly in between. It’s not free, but that means it’s not ad-supported. It’s entirely ad and sponsorship-free, giving a better viewing experience for you and better, more predictable economics for us creators.
But it also costs far less than the major streamers, especially when you go to nebula.tv/wendover. When you sign up there, you’ll get 40% off an annual plan, which brings the per-month cost down to just $3. I think that’s a steal for the quality of exclusive content you get, combined with all our normal videos ad-free.
What hopefully seals the deal is that it genuinely is the best way to support us and so many other independent creators. So head over to nebula.tv/wendover to sign up with 40% off, and thanks if you do.