
On a clear afternoon drive along Route 7 through Ashburn, Virginia, you'll pass a stretch of land that, from the road, looks like any other mid-Atlantic suburb: parking lots, low-rise office buildings, the occasional warehouse. Look closer and you are seeing the densest concentration of data centers on the planet: roughly 300 facilities clustered in Loudoun and its neighboring counties. Approximately two-thirds of global internet traffic passes through these facilities at any given time. The buildings carry almost no markings; they don't need signs. They run nonstop, drawing on the regional electrical grid and the local water supply in quantities the county's infrastructure was never designed to handle, shedding heat that rises from their rooftops in visible waves on cold mornings. In fiscal year 2025, Loudoun County expects data center tax revenues of close to $900 million, nearly the county's entire operating budget, from a single land use that permanently employs a few hundred people.
A growing list of places is starting to resemble Ashburn: rural Louisiana, Wyoming, Texas, Ireland, Singapore, the suburbs of large European cities. The AI boom has required physical infrastructure, and that infrastructure is massive, water-intensive, and power-hungry in ways the public discourse on AI has been slow to fully address. Global data center electricity consumption was an estimated 415 terawatt-hours in 2024, roughly 1.5 percent of global electricity use, and is projected to more than double by 2030, to roughly 945 terawatt-hours, according to the International Energy Agency's comprehensive 2025 report on energy and AI. That is slightly more than Japan consumes annually, and it is a figure many in the industry had been reluctant to state plainly. The United States alone accounts for roughly 45 percent of the electricity used in data centers worldwide. By the end of this decade, the country is expected to use more electricity running AI infrastructure than it does producing aluminum, steel, cement, chemicals, and all other energy-intensive goods combined.
| Topic | Inside the Data Centers Powering the AI Revolution — And Their Climate Cost |
| --- | --- |
| Scale of Electricity Use | Global data center electricity consumption: ~415 TWh in 2024 (1.5% of world total); projected to more than double to ~945 TWh by 2030 — slightly more than Japan's total electricity consumption. US data centers alone consumed 176 TWh in 2023, roughly equal to Ireland's total national use. By 2030, the US will use more electricity for data centers than for all energy-intensive industries combined (IEA) |
| Power Density Contrast | Traditional data centers: 5–10 kW per rack. AI-optimized facilities: 60+ kW per rack. A typical AI-focused data center uses as much electricity as 100,000 households. The largest hyperscale facilities under development will use 20x that. Meta's Hyperion (Louisiana) will draw more than twice New Orleans' total power consumption |
| Carbon Emissions | Data center emissions are projected to grow from 220 million tonnes (Mt) in 2024 to 300–320 Mt by 2035 (IEA). AI systems alone could produce 32.6–79.7 million tonnes of CO2 in 2025 (ScienceDirect). Processing 1 million tokens emits carbon equivalent to driving a gas car 5–20 miles (MIT Lincoln Lab) |
| Water Consumption | Average AI data center: 550,000 gallons of water daily for cooling. Large hyperscale facilities: up to 5 million gallons per day — equivalent to a city of 50,000 people. Cooling can account for 40% of a data center's total electricity use. Liquid cooling systems are increasingly being deployed for high-density AI chips |
| Reference | IEA — Energy and AI Special Report (iea.org) |
The IEA draws the comparison to traditional industries explicitly, and it is worth dwelling on. An AI-focused data center, the agency notes, can consume as much electricity as a sizable aluminum smelter. But a smelter spreads across an industrial site, its workforce and supply chain dispersed through a regional economy. A data center is a windowless, concrete-encased structure that employs perhaps 100 to 150 people, sits on land that could otherwise hold housing, farmland, or open space, and draws power from a grid built for a different era of demand. Meta's Hyperion data center, now under construction in Louisiana, is expected once operational to consume more than twice as much electricity as the entire city of New Orleans. A data center proposed in Wyoming would consume more electricity than every home in the state combined. These are committed construction projects, not speculative estimates.
In certain regions, the water dimension is arguably more immediately concerning, and it receives less attention. An average AI-optimized data center uses approximately 550,000 gallons of water daily for cooling. Large hyperscale facilities can require five million gallons per day, as much water as a city of 50,000 people uses. Because cooling can account for 40 percent of a data center's total electricity consumption, the industry is shifting toward liquid cooling systems for high-density AI chips, and "closed-loop" systems that sharply reduce water consumption are an active area of engineering investment. But deployment of these technologies is uneven, and many existing and planned facilities in water-stressed areas continue to draw from nearby aquifers and watersheds at rates that strain local supplies, largely without residents' knowledge.
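The city-of-50,000 comparison is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, using the figures quoted above and assuming roughly 100 gallons of residential water use per person per day (a common US planning estimate, not a figure from this article):

```python
# Back-of-envelope check of the water figures quoted above.
# Assumption (not from the article): ~100 gallons/person/day,
# a common US residential planning estimate.

GALLONS_PER_PERSON_PER_DAY = 100

avg_ai_dc_gallons = 550_000       # average AI-optimized data center, per day
hyperscale_gallons = 5_000_000    # large hyperscale facility, per day

avg_equiv_people = avg_ai_dc_gallons / GALLONS_PER_PERSON_PER_DAY
hyperscale_equiv_people = hyperscale_gallons / GALLONS_PER_PERSON_PER_DAY

print(f"Average AI data center  ≈ daily water use of {avg_equiv_people:,.0f} people")
print(f"Hyperscale facility     ≈ daily water use of {hyperscale_equiv_people:,.0f} people")
```

Under that assumption, five million gallons per day works out to the daily use of about 50,000 people, which matches the comparison in the summary table above.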
The carbon picture is complicated by the genuine efforts, and often genuine investments, that the big tech companies are making in renewable energy. Google, Microsoft, Meta, and Amazon have all pledged to run their operations on clean energy and are backing wind, solar, and long-term power purchase agreements at significant scale. But commitment and current reality are not the same thing. The IEA's analysis finds that while renewables will supply half the growth in global data center demand through 2035, natural gas generation is also growing, by 175 terawatt-hours, to meet that demand, particularly in the United States. Driven in large part by AI's energy requirements, development of new gas-fired power has doubled in the past year. In some regions, coal plant retirements planned under pre-AI demand forecasts are being postponed because the grid cannot afford to lose the generation capacity. What makes this difficult to reckon with is that the clean-energy narrative and the fossil-fuel narrative are both true at the same time.
It is hard to ignore the gap between the public discussion of AI as a tool for improving energy systems, accelerating scientific research, and reducing inefficiency, and the physical reality of how the technology is being deployed. Vijay Gadepally of MIT, who oversees research projects at the MIT Lincoln Laboratory Supercomputing Center, has been working on this gap in a very practical way: building software tools that shift AI workloads to lower-carbon time windows, power-cap server processors to 60–80 percent capacity, and reconsider which models actually need to be trained to completion. If widely implemented, he estimates, such interventions could cut global data center electricity demand by 10 to 20 percent. They require no significant capital investment. They require changing default behavior, which, as anyone who has tried to alter institutional defaults knows, is its own kind of infrastructure problem.
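The scheduling idea behind the first of those interventions is conceptually simple: given a forecast of grid carbon intensity, defer flexible jobs into the cleanest hours. A minimal sketch with hypothetical hourly data (the actual MIT Lincoln Laboratory tools are more sophisticated, and real grids publish their own carbon-intensity feeds):

```python
# Minimal sketch of carbon-aware workload scheduling: given an hourly
# forecast of grid carbon intensity (gCO2/kWh), pick the contiguous
# window with the lowest average intensity for a deferrable AI job.

def best_window(forecast, job_hours):
    """Return (start_hour, avg_intensity) of the cleanest window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical 24-hour forecast, not real grid data: dirtier in the
# evening, cleaner midday when solar output peaks.
forecast = [420, 410, 400, 390, 380, 370, 350, 320,
            280, 240, 210, 190, 185, 195, 230, 280,
            330, 380, 430, 450, 460, 455, 445, 430]

start, avg = best_window(forecast, job_hours=4)
print(f"Run the 4-hour job starting at hour {start} "
      f"(avg {avg:.0f} gCO2/kWh vs 24h mean {sum(forecast)/24:.0f})")
```

With these illustrative numbers, the scheduler lands the job in the midday solar trough, at an average intensity well below the day's mean; the point is that the emissions savings come from a default, not from new hardware.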
Local governments and utility regulators are increasingly being asked to answer the question of who bears the costs that data centers do not cover. In affected US regions, electricity prices are rising at more than twice the rate of inflation. In Northern Virginia, thousands of on-site diesel generators, each the size of a railcar, run for extended periods, creating air quality problems that fall disproportionately on residents who didn't choose to live near a data center campus and who aren't employed by one. More than 230 environmental organizations have demanded stronger federal regulation of data center energy and water use. In 2025, the National Academies of Sciences convened a workshop on how to map, measure, and mitigate what is now acknowledged as a national infrastructure challenge. None of this means the growth will stop. It may mean it is time to be more forthright about what the growth costs, and more deliberate about who pays them.
