AI and Data Centers: Powering the Next Computing Revolution




As generative AI applications proliferate, data centers have become the linchpin of the global technology economy. Leading cloud providers and AI companies are racing to build vast new facilities to host AI training and inference workloads. This explosive growth is driving an unprecedented surge in data center capacity needs—and a corresponding demand for power and capital. Industry analysts project that global data center electricity demand could roughly double by 2030, reaching on the order of 945 terawatt-hours (TWh) annually. In the United States alone, data centers consumed about 183 TWh in 2024 (over 4% of U.S. power use), a figure expected to more than double by 2030. These trends underscore a simple fact: to sustain the AI revolution, companies and governments must dramatically scale up infrastructure for computing, power generation, cooling and networking. The investments needed run into the trillions. McKinsey and other analysts forecast roughly $6–7 trillion in global data center capital expenditures by 2030.

Skyrocketing Demand and Investment

A confluence of factors is driving data center build-out at breakneck speed. High-performance AI models demand vast numbers of specialized processors (GPUs and custom AI chips), each consuming far more power than a traditional server CPU. Densely packed GPU racks now draw tens of kilowatts each, and next-generation AI racks may reach up to 600 kW apiece. As a result, AI workloads are rapidly increasing the power density of data center facilities. Goldman Sachs Research forecasts that global power demand from data centers will rise 50% by 2027 and roughly 165% by 2030 (versus 2023 levels). In absolute terms, worldwide data center power capacity is projected to grow from about 55 gigawatts (GW) today to roughly 84 GW by 2027.
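The growth rate implied by those capacity figures can be sanity-checked with a few lines of arithmetic. The sketch below is illustrative only; the three-year window is an assumption, since the projection does not date the 55 GW baseline precisely.

```python
# Back-of-envelope check on the projected capacity growth cited above.
# Assumption (not from the article): ~55 GW applies to 2024 and ~84 GW
# to 2027, i.e. a three-year window.
start_gw, end_gw, years = 55.0, 84.0, 3

# Compound annual growth rate implied by the two endpoints.
cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied capacity CAGR: {cagr:.1%}")  # roughly 15% per year
```

Under those assumptions, the projection works out to roughly 15% compound annual growth in installed data center power capacity.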

This growth is being fueled by massive spending. The world’s hyperscale cloud companies (Amazon, Microsoft, Google, Meta, etc.) are pouring capital into new data centers and manufacturing. McKinsey estimates that data center infrastructure investment (excluding IT hardware) will exceed $1.7 trillion by 2030. Another projection puts total required data center CAPEX at nearly $7 trillion by 2030. Even in the short term, capital spending is surging: some reports expect hyperscaler investment in AI infrastructure to reach $371 billion in 2025 (a ~44% year-over-year jump). In parallel, utilities are ramping up their own capital programs: U.S. electric and gas companies plan to spend roughly $212 billion on transmission and generation in 2025, up 22% from 2024. These figures illustrate that both the tech and power sectors recognize AI as a game-changing investment opportunity. As Deloitte notes, the stakes have never been higher: power grids, factories and data centers are being built out at a “trillion-dollar” scale to meet AI-driven demand.

China and the U.S. account for the lion’s share of this expansion. The International Energy Agency (IEA) projects that together these two regions will contribute about 80% of the global increase in data center electricity use through 2030. In the U.S., data center operators are planning campuses requiring hundreds of megawatts to gigawatts of power each. One survey found the largest new U.S. data centers under construction may need up to 2,000 MW (2 GW) apiece – a scale comparable to the biggest power plants. In Northern Virginia (the country’s top data center market), about one-quarter of all power sales now go to data centers, and capacity is strained: Dominion Energy estimates that existing orders could more than double Virginia’s AI data center capacity by 2028, potentially reaching 10 GW by 2035.

Infrastructure and Energy Challenges

This breathtaking growth poses serious challenges for energy, real estate and regulation. Conventional power grids and older data center designs are being stretched to their limits. The average large AI-focused data center uses advanced servers with tens of thousands of GPUs, which consume two to four times more electricity per chip than the processors in traditional servers. Cooling those racks is also far more intensive: cooling and facility overhead can account for 20–30% of a data center’s power draw, and water usage is enormous (U.S. data centers consumed ~17 billion gallons of water in 2023). As one analysis notes, even a small 5-acre facility running AI workloads could draw 50 MW instead of 5 MW, necessitating infrastructure far beyond that of a normal enterprise data center.
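To see how quickly such a facility reaches a 50 MW grid connection, consider a rough sizing sketch. Only the 20–30% overhead range comes from the figures above; the rack count and per-rack draw are hypothetical example values.

```python
# Illustrative facility-sizing sketch using the overhead range cited above.
# The rack count and per-rack power are hypothetical, not article figures.
racks = 500              # AI racks in a hypothetical high-density facility
kw_per_rack = 80.0       # assumed IT draw per rack (kW)
cooling_overhead = 0.25  # cooling/facility overhead, mid-range of 20-30%

it_load_mw = racks * kw_per_rack / 1000         # 40 MW of IT load
total_mw = it_load_mw * (1 + cooling_overhead)  # 50 MW at the meter
print(f"IT load: {it_load_mw:.0f} MW, total facility draw: {total_mw:.0f} MW")
```

Five hundred 80 kW racks already demand a 50 MW interconnection once cooling and facility overhead are included, which is why AI campuses are negotiating with utilities like industrial plants rather than office tenants.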

Grid reliability and cost are immediate concerns. Data centers already consume over 25% of electricity in key regions (e.g. 26% in Virginia). In states like Virginia, North Dakota and Oregon, data centers now account for double-digit shares of total supply. Without upgrades, such concentrated loads risk outages. Utilities in affected markets have had to invest billions in new transmission and generation capacity to serve data centers, often with costs passed on to residential customers. A study of the PJM grid (mid-Atlantic) estimated that data center load drove a $9.3 billion increase in capacity market prices for 2025–26, translating to $16–18 higher monthly bills for households in parts of Maryland and Ohio. The challenge is compounded by long lead times: while a data center can be built in 1–2 years, adding new power plants or transmission lines often takes much longer (gas plants under development may not come online until the late 2030s).

Location and siting are also issues. Data center developers cluster in areas with cheap power and fiber, but that creates local bottlenecks. Northern Virginia exemplifies this: space and power are scarce, pushing hyperscalers to look to secondary markets for expansion. Globally, similar pressures are emerging in Asia-Pacific and other regions. Beyond power, securing real estate (land zoned for heavy infrastructure) and water for cooling has become a competitive enterprise, as local communities vie for the economic benefits of AI infrastructure.

Technological Innovations in Hardware and Cooling

To cope with these challenges, the industry is innovating at multiple levels. Data center design is evolving to achieve higher efficiency and density. Leading operators are deploying liquid cooling and direct-to-chip systems to handle heat from high-power racks. For example, advanced liquid–air hybrid cooling or closed-loop immersion systems can dramatically reduce the energy required for cooling and allow much higher rack densities. Some new facilities even capture rainwater or use geothermal sources for cooling, potentially meeting a third or more of their cooling needs while conserving freshwater. Companies are also pioneering “energy-efficient AI chips” tailored for inference workloads, which can vastly cut power per computation compared to general-purpose GPUs (for instance, dataflow architectures can eliminate redundant memory accesses).

At the chip and systems level, innovations are underway. Leading chipmakers are exploring moving power delivery to the back of processors, using optical interconnects instead of copper, and stacking chips in 3D to slash losses. These techniques could reduce energy use per AI computation by orders of magnitude. Nvidia’s roadmap illustrates the trajectory: today’s high-end GPU racks may draw ~120 kW each, but emerging designs (e.g. the “Vera Rubin” series) could reach 600 kW per rack with liquid cooling, while next-generation architectures promise even more efficiency. On the software side, hyperscalers are developing distilled and distributed AI training methods that can achieve the same model accuracy with fewer computations, somewhat taming growth in power demand. AI itself is also being deployed to optimize data center operations – for instance, machine learning algorithms can manage cooling, adjust server utilization in real time, predict equipment failures, and orchestrate hybrid cloud workloads to minimize waste. Such “AI for infrastructure” tools can shave 10–20% off operating costs by squeezing more performance from a given power budget.

On the grid side, both utilities and data center operators are experimenting with new models. Some proposals involve colocating data centers with generation and storage (“additive infrastructure”) so that the facilities can directly consume on-site renewables or even share transmission upgrades. For example, a massive coal plant conversion project in Pennsylvania will redevelop an old site into a gas plant and adjacent data center campus, effectively doubling capacity to meet AI demand. Another approach is advanced grid equipment: dynamic line ratings, solid-state transformers, and other grid-enhancing technologies can boost transmission capacity by 30–100% with far shorter build times than new lines. Regulators and industry groups are beginning to incentivize “smart” grid solutions to keep pace with data center loads, such as capacity markets and special rate structures.

Sustainability and Environmental Impact

The environmental footprint of this expansion is under intense scrutiny. Data centers already consume a notable slice of electricity: about 1.5% globally in 2024, rising to around 3% by 2030 even in a “moderate growth” scenario. While small in percentage terms, their concentrated loads can strain local grids and require substantial new generation. Critically, the source of that power matters for emissions. Today in the U.S., roughly 40% of data center electricity comes from natural gas, with coal, nuclear and renewables making up most of the rest. Several major tech companies have therefore committed to shifting their data centers to renewable or carbon-free power. For instance, state legislation in California, Minnesota, and New Jersey is already pressing data centers to procure a growing share of renewable energy, and some Virginia initiatives would mandate reporting of power and water use. Early actions include tech firms signing long-term green power contracts, and even reviving idle nuclear plants to meet demand.

Water use is another growing issue. Efficient hyperscale facilities can still consume billions of gallons annually for cooling (estimates for hyperscale centers reach 16–33 billion gallons per year by 2028). This has prompted legal scrutiny: utilities and regulators in several states are examining permitting rules and “water usage transparency” for data centers. (The National Caucus of Environmental Legislators reports that in 2025 alone, over 60 state bills have been introduced on data center energy, environmental impacts and transparency.) Data center developers are responding by adopting air cooling or non-water-based systems wherever possible, and by investing in water recycling.

Still, even with these efforts, the sector’s carbon footprint remains substantial. A recent analysis noted that a typical hyperscale AI server cluster uses as much electricity as 100,000 households. This has sparked debate among executives and the public about trade-offs: Pew Research finds Americans split on whether AI will ultimately benefit or harm the environment. Business leaders thus face a dual imperative: meet performance targets for AI while also hitting ESG goals. Sophisticated corporate buyers of data center services are increasingly demanding renewable energy commitments and efficiency metrics (like power usage effectiveness, or PUE) as part of contracts. For example, some leading builders are designing new campuses with ultra-efficient infrastructure targeting PUEs near 1.1 (versus the industry average of 1.5–1.7).
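The PUE metric is simply total facility power divided by IT equipment power, so the gap between a 1.1 design and the industry average translates directly into megawatts. The sketch below uses a hypothetical 40 MW IT load; only the PUE targets come from the text.

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# The 40 MW IT load is a hypothetical example; the PUE values are the
# targets and industry range cited in the text.
it_load_mw = 40.0

total_at_target = it_load_mw * 1.1   # facility draw at a PUE-1.1 design
total_at_average = it_load_mw * 1.6  # draw at a mid-range 1.5-1.7 PUE

savings_mw = total_at_average - total_at_target
print(f"PUE 1.1: {total_at_target:.0f} MW; PUE 1.6: {total_at_average:.0f} MW; "
      f"overhead avoided: {savings_mw:.0f} MW")
```

On a 40 MW IT load, moving from a PUE of 1.6 to 1.1 avoids roughly 20 MW of non-computing draw, power that would otherwise go to cooling and facility overhead rather than AI workloads.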

Business Models and Market Winners

For investors and executives, these developments underscore that data centers – and related supply chains – represent a major new asset class. Industry analysts caution that while some valuations may be stretched, the underlying demand drivers are real and long-term. KKR observes that today’s AI-driven data center cycle is underpinned by decades-long contracts with big tech clients, making it different from past booms. The key to success will be creating durable “moats” around infrastructure: control of power and land, strong relationships with hyperscalers, and the operational excellence to deliver cutting-edge facilities on time and on spec. Conversely, projects that neglect to secure cheap power or fail to lock in long-term off-take deals risk being left behind in an oversupplied market.

Some specialized business models are emerging. For instance, providers of high-density co-location offer tailored environments for AI hardware, with features like on-demand liquid cooling, flexible power, and direct links to cloud networks. A few startups even tout “AI-ready” data center racks that combine GPUs and low-power AI chips in novel ways to maximize inference efficiency. Large investors are also backing “fused infrastructure” partnerships: joint ventures that build data center campuses alongside solar, wind, or nuclear plants to ensure clean power supply. One example is a planned $20 billion renewables-and-storage “energy park” with co-located data center load, slated for operation by 2026.

From a regional standpoint, competition is fierce. U.S. tech hubs like Virginia, Texas and California lead in data center count and activity, but secondary markets with abundant wind, solar and affordable land (Midwest, Nordic countries, Middle East) are attracting new investment. Governments see data centers as drivers of economic growth, and many have sweetened the deal. State and local incentives – expedited permitting, tax breaks, subsidized power – now routinely target data center projects. On the flip side, regulators are imposing new compliance requirements, such as mandating renewable usage or reporting energy/water metrics. Navigating this evolving regulatory landscape will be a critical leadership challenge.

Risks and Regulatory Landscape

Despite the opportunity, several risks loom. Infrastructure bottlenecks – especially in transmission and interconnection – could delay projects and inflate costs. Utilities are understandably cautious; their executives note that “competition for resources” is acute, and regulatory uncertainty can stall new plant or line approvals. Without proactive planning, data center expansion could exacerbate power price volatility for ordinary customers. State legislators have taken notice: at least 22 states have introduced legislation in 2025 to manage data center impacts on the grid. Proposed measures include ratepayer protections (e.g. barring data center-driven grid upgrade costs from being passed onto households), renewable energy requirements, and mandatory energy usage disclosures. Such rules reflect a broader mandate: ensure that explosive tech growth remains “harmonious with clean energy goals.”

There are also geopolitical factors. The semiconductor supply chain is under duress, and trade tensions between the U.S. and China could affect access to key components (chips, memory, networking gear). Some countries are already viewing AI data center capacity as a matter of national competitiveness and security. Firms must therefore diversify suppliers and work closely with governments on supply chain resilience. Additionally, public backlash is possible if communities feel the costs (in electricity or water) outweigh the jobs and innovation benefits. Data center companies will need to engage transparently with regulators and stakeholders to mitigate these risks.

From a legal standpoint, operators face emerging compliance issues. For example, California and New Jersey have considered laws requiring data centers to procure renewable energy and report usage statistics. Water rights and discharge regulations are also gaining attention (one law firm notes that water use by hyperscale facilities could trigger new permits and standards). While these policies are still nascent, CEOs would be wise to anticipate stricter environmental reporting and to invest in green certifications. (This article is informational and does not constitute legal or investment advice; companies should consult qualified advisors on regulatory compliance.)

Looking Ahead: Balancing Growth and Efficiency

The transformation under way in data center infrastructure is historic in scale. As AI and high-performance computing reshape industries, the backbone of our digital world – data centers and power grids – must evolve rapidly. On one hand, this presents enormous opportunity: companies that can deliver the massive power, connectivity and efficiency that AI demands stand to reap trillion-dollar markets. On the other hand, shortfalls in planning could create pinch points that slow technological progress.

Most experts agree that the long-term demand is real, even if short-term hype fluctuates. Goldman Sachs, Deloitte and others expect occupancy rates at modern data centers to remain high through at least the late 2020s, despite some market cycles. The keys for senior leadership will be strategic: locking in advanced technology for efficiency, forging partnerships across industries (real estate, energy, manufacturing), and working with regulators to ensure that infrastructure scaling remains viable. Successful players will blend aggressive growth with sustainability—adopting renewable power, cutting-edge cooling, and AI-driven optimization—to meet both business and climate imperatives.

In summary, the intertwining of AI and data centers marks a new inflection point in the digital economy. The scale of investment and innovation required is unprecedented. But with careful management – combining technical ingenuity, financial discipline, and regulatory savvy – it can be harnessed to drive continued economic growth, technological leadership, and even progress on environmental goals. The coming decade will show whether industry and policymakers rise to the challenge; so far, the signals point to a “powering of AI” that aims to be as smart as the technology itself.

Sources, References and Additional Reading

  • International Energy Agency (IEA) – analysis and outlooks on data centres, AI and global electricity demand.
  • McKinsey – research on global data center capital expenditure and AI infrastructure trends.
  • Goldman Sachs – research on data center power demand, AI-driven electricity consumption and infrastructure investment.
  • Deloitte – insights on AI, power grids and the industrial infrastructure build-out.
  • KKR – perspectives on data centers as an emerging infrastructure asset class.
  • National Caucus of Environmental Legislators – tracking of U.S. state-level data center energy and environmental legislation.
  • Pew Research – public opinion research on AI, technology and the environment.

Disclaimer: The information in this article is provided for general informational purposes only and does not constitute legal, regulatory, tax, investment, financial or other professional advice, and should not be relied upon as such. You should obtain independent advice from qualified professionals in the relevant jurisdiction(s) before making any decision or taking any action based on the content of this article. While reasonable efforts are made to ensure that the information is accurate and current, 1BusinessWorld makes no representations or warranties, express or implied, as to its completeness, reliability or suitability. To the fullest extent permitted by law, 1BusinessWorld and the author accept no liability for any loss or damage arising from the use of or reliance on this article. The views expressed are those of the author and do not necessarily reflect the views of 1BusinessWorld or its affiliates.