Using AI to Enhance AI Infrastructure Systems and Ease Stranded Power

AI is driving record power demand in data centers, with rack densities hitting 150 kW. C&D Technologies helps operators tackle cooling and grid strain while unlocking energy savings and sustainability gains.
Jan. 26, 2026
12 min read

Key Highlights:

  • Data center power consumption is rapidly increasing, driven by AI, high-performance computing and cloud services, necessitating new design and retrofitting strategies.
  • Legislation and incentives, such as the U.S. IRA and EU directives, are accelerating data center construction and modernization, especially in rural areas with abundant renewable resources.
  • AI can be used to optimize workload scheduling, manage power fluctuations, and improve energy efficiency through analytics and integrated energy storage systems.

Power is a force we tap, store and release to maintain uptime in our data centers. Not all data centers are created equal. We have seen power demands increase in our data centers organically, followed by another boost from high-performance computing and again for artificial intelligence (AI).

Increased power demand changes data center designs and retrofits. Power has become the prime concern for both. In fact, the U.S. Department of Energy (DOE) estimates that data centers, which currently account for roughly 4% of overall U.S. power consumption, will climb to between 6.7% and 13% of the country's total.

Where power increases, so does cooling demand. Rising primary power and cooling translate to additional needs across backup generation components. Outside of disaster events, backup components are predominantly ignored as a primary generation source. New advancements in power and cooling are poised to change designs and operations moving forward.

Currently, data centers draw about 27 GW of power across known and estimated footprints. As of the first half of 2025, more than 5,200 megawatts (MW) of data center power capacity are under construction in the U.S. This figure reflects ongoing building activity for hyperscale and multi-tenant data centers. The growth is largely attributed to cloud computing, AI and high-performance computing, although these applications do not represent all new construction. New legislation may accelerate developments through 2030.

Businesses can now deduct the full cost of qualifying equipment and short-lived capital assets (such as servers, cooling systems and power infrastructure) in the year they are placed in service, thereby lowering the after-tax cost of expensive hardware. 

Further, data centers can fully expense their production space. Opportunity zone enhancements favor some rural areas that would otherwise seem unattractive, including places like natural gas production pads. In fact, with power as a primary factor for new construction, some rural areas may prove beneficial due to a lack of competition for power resources and land for renewables. Adding regulatory and permitting streamlining rounds out the incentives for AI data center construction. Many of these incentives phase out in 2030, creating a near-term construction boom.

Data Center Power Requirements

Data centers are designed and constructed with the expectation of a particular power draw. The redundant (backup) power components are sized to carry the full load should the main power feed go down. Sizing redundancy is trickier with fluctuating loads and massive temporary power spikes.
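To make the sizing problem concrete, here is a minimal sketch of why spiky loads complicate redundancy. The function and every number in it are illustrative assumptions, not an industry formula: `spike_factor` stands in for temporary AI power spikes and `design_margin` for engineering headroom.

```python
# Hypothetical sizing sketch: backup capacity must carry the full load,
# including temporary spikes, if the main feed goes down.

def required_backup_kw(steady_load_kw: float, spike_factor: float,
                       design_margin: float = 0.2) -> float:
    """Size backup to the worst-case (spiking) load plus a design margin."""
    peak_load_kw = steady_load_kw * spike_factor  # worst-case instantaneous draw
    return peak_load_kw * (1 + design_margin)     # add engineering headroom

# Example: a 1,000 kW steady IT load that can spike 1.5x during AI training
# must be backed by 1,800 kW of capacity, not 1,000 kW.
print(required_backup_kw(1000, 1.5))  # 1800.0
```

The point of the sketch is that sizing to the steady-state load undersizes the backup path by the full spike factor.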

AI poses a design challenge as loads vary significantly from training operations to image generation. Power densities of up to 140 kW per cabinet are likely with the latest generative AI processors. As such, AI is a disruptor to power equations and, consequently, cooling calculations. Like most computing loads, AI fluctuates; unlike other computing systems, its fluctuations are staggeringly large.

Across the board, data center power demands are rapidly growing. Selecting power components is not just a “pick one and done” decision. Initial capital outlay is impacted by laws, incentives, and environmental, social and governance (ESG) goals. Local codes and requirements may dictate the use of one efficiency scheme at one site and something completely different at another. Efficiency goals, coupled with regulation, preference and recyclability, must all be considered.

Battery Considerations

Batteries are a good example. Battery costs, room and environmental requirements, functionality, lifecycle and supporting battery energy storage systems (BESS) costs factor into procurement decisions. While there are many battery chemistries available, the two predominant constructions for data center uninterruptible power supply (UPS) systems are lead-acid and lithium-ion.

Lead-acid batteries are the most mature and, as such, also are the most recyclable construction available. Global and in-country legislation and regulations may impact decisions. The U.S. Inflation Reduction Act (IRA, 2022), One Big Beautiful Bill Act (OBBBA, 2025), EU Clean Industrial Deal (CID, 2025) and supporting frameworks like the EU Batteries Regulation, Net-Zero Industry Act (NZIA), US EPA Clean Air Act, DOE Data Center Standards and state EPR laws provide legally binding and best practice guidelines for usage, storage and recycling.

Beyond the laws and frameworks, ESG targets aid compliance and provide guidance for environmental stewardship. ESG guidelines address both the initial battery purchase and the battery's afterlife, in particular recyclability. It is the latter where lead-acid chemistries shine. Longer-lifecycle batteries, like Pure Lead Max (Figure 1), offer extended implementation life and nearly 99% recyclability.

Currently, their lithium counterparts have a problematic and costly recycling path. Recycling operations are being built to reuse materials, relieving stress on mining operations. Because lithium recycling is not yet mainstream, stepped-down capacity, second-life usage becomes part of the battery’s lifecycle in lieu of recycling. In short, there are many factors to consider.

A common thread across all updated legislative actions, ESG goals, data center designs and future planning for critical infrastructure, including power grids and cooling, is the impact of AI. Figure 2 shows the additional power needed to complete each work effort. Compared to a traditional web search, AI uses significantly more power per task. As reliance upon AI grows, power demands will reflect that increase.

AI is such a disruptor that many ESG targets have been either abandoned or drastically adjusted to accommodate the differences in demand and day two concerns. According to Forbes, data center infrastructure and operating costs are set to increase by $76 billion by 2028. U.S. data center power demand is forecast to more than double from ~35 gigawatts in 2024 to 78 gigawatts by 2035 (BloombergNEF).

Deloitte estimates AI data center power demand in the U.S. could surge more than 30-fold, reaching 123 gigawatts by 2035, compared to 4 gigawatts in 2024. Globally, data center electricity consumption is expected to more than double from just over 415 terawatt-hours (TWh) in 2024 to an expected 945 TWh by 2030 (IEA).

Tasks are becoming more efficient, in part, due to improvements in processor efficiency, but even these average numbers vary greatly. Real energy and latency (response times) vary with model size, prompt length, batch operations/split operations, hardware, data-center efficiency and whether the inference runs on specialized accelerators or graphical processors.

Training/fine-tuning numbers are approximate and intended to illustrate differences in power. More processing operations require exponentially more power. The larger the data set, the more power is needed to comb through and assimilate it.


The byproduct of greater power consumption is increased cooling required to reject the additional heat generated. It goes without saying that more cooling necessitates more power. New liquid cooling methods increase cooling efficiency. While helpful, cooling remains a significant portion of the overall data center operations budget.

Stranded Power

The industry is redefining data center power and cooling in real time to address the hungry demands of AI. Cooling methods like immersion, direct-to-chip, rear door heat exchangers and new materials that don’t propagate heat are all parts of the solution.

To keep computing, storage, networking and cooling systems up and running, data centers rely on backup power systems and, in many cases, tertiary and quaternary power delivery paths. Balancing power demand, backup power and uptime is sometimes at odds with power efficiency measures. Stranded power is one such example.

Utility-stranded power is generated but remains unused and not stored. Inside a data center, stranded power is allocated but remains unused. Managing stranded power is difficult at best, as power demand fluctuates with processing requests. Harnessing stranded power leads to efficiency gains.

When utilities generate power, some of it never reaches customers because of resistive losses along transmission lines. Shorter transmission distances reduce those losses along with equipment and line requirements. AI-dedicated facilities are considering on-site power generation to alleviate some of these losses.
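A back-of-the-envelope calculation shows why distance matters. This is a simplified single-line model (unity power factor, uniform resistance); the voltage, resistance and distances are illustrative assumptions, not data from the article.

```python
# Simplified resistive-loss sketch: loss grows linearly with line length
# for a fixed delivery, so shorter (or on-site) generation loses less.
# All numbers are illustrative assumptions.

def line_loss_fraction(power_mw: float, voltage_kv: float,
                       ohms_per_km: float, distance_km: float) -> float:
    """Fraction of transmitted power dissipated as I^2 * R heat."""
    current_a = (power_mw * 1e6) / (voltage_kv * 1e3)   # I = P / V
    loss_w = current_a ** 2 * ohms_per_km * distance_km  # P_loss = I^2 * R
    return loss_w / (power_mw * 1e6)

# Delivering 100 MW at 345 kV over 200 km vs. 10 km of line:
print(line_loss_fraction(100, 345, 0.03, 200))  # ~0.005, i.e. ~0.5% lost
print(line_loss_fraction(100, 345, 0.03, 10))   # ~0.00025, i.e. ~0.025% lost
```

Real grids see larger compounded losses across transformers and distribution, but the linear-in-distance intuition holds.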

Microsoft recently contracted to purchase power from the restarted Three Mile Island nuclear plant. Natural gas sites are increasingly in demand. Geothermal locations are similarly popular for AI site construction. Small modular reactors and fuel cells are also viable power options. On-site power operates in island mode or is connected to grid resources.

Once generated, power must be either consumed or stored. Battery systems and thermal storage systems accomplish this task. Battery systems are not always considered a power source outside of test or failover.

Within a data center, battery storage is a critical component for maximizing uptime, dealing with power fluctuations, peak shaving and energy injection schemes. Industry estimates state that roughly 30-65% of data center power is stranded.

Overallocation is one culprit. When data centers begin capacity planning, planners allocate power per cabinet across the facility’s cabinet footprint. Colocation facilities contract portions of their overall site power for tenant use within their cages/data halls. When contracted allocations differ from consumption, the remaining power is similarly stranded.

Overprovisioning for redundancy is another offender. For example, two mirrored UPS systems, each capable of supporting the full IT load, are installed, but only one runs at a time. This leaves the other’s capacity stranded under regular operation. Uneven rack and row loading is another contributing factor. In this scenario, sites may allocate 10 kW per cabinet but less is consumed. One cabinet may draw 3 kW, one 7 kW and one 9 kW.
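The uneven-loading example above can be worked through directly: 10 kW allocated per cabinet, but cabinets drawing 3, 7 and 9 kW leave the remainder stranded. A minimal sketch:

```python
# Stranded-power arithmetic for the uneven rack-loading example:
# allocation minus actual draw is power that is committed but unused.

def stranded_power_kw(allocated_kw_per_cab: float, draws_kw: list) -> tuple:
    """Return (stranded kW, stranded fraction of total allocation)."""
    allocated = allocated_kw_per_cab * len(draws_kw)  # total committed power
    consumed = sum(draws_kw)                          # total actual draw
    stranded = allocated - consumed
    return stranded, stranded / allocated

stranded, frac = stranded_power_kw(10, [3, 7, 9])
print(f"{stranded} kW stranded ({frac:.0%} of allocation)")
# 11 kW stranded (37% of allocation)
```

Even this small example lands inside the 30-65% stranding range cited above; multiplied across thousands of cabinets, the committed-but-idle capacity becomes substantial.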

Underpopulated data halls can further lead to stranding as electrical infrastructure sits idle until the space is fully occupied. Providers and tenants alike struggle to find the right balance of contracted power versus usage. Decommissioning equipment as loads shift to the cloud or are retired contributes to vacancies. Contracted power does not always reflect those changes. Although colocation providers are working with tenants to reclaim as much as possible, challenges remain.

Another source of stranded power is cooling capacity limitations. Sometimes, the thermal limits (cooling capacity) within the room prohibit bringing new systems online despite available power. Heat within any space must be adequately rejected, or equipment will simply shut down due to thermal protections.

Stranded power due to cooling limitations is driving the adoption of liquid cooling strategies. Liquid cooling boosts heat rejection dramatically. In fact, many AI applications cannot function without liquid cooling. Areas with water restrictions will use refrigerants, but they will use liquid, nonetheless. Some facilities will use a mixture of cooling methods.

AI in 2026 and Beyond

Perhaps the most significant benefit of an AI facility is the ability to use that same AI for the betterment of the physical AI facility. Power is difficult to manage on a granular basis. There is a direct correlation between the power consumed and the tasks performed, as shown in Figure 2.

Perhaps the most difficult tasks in power management involve managing the fluctuations in processing and the unpredictability of workloads. However, machines, in conjunction with AI, can analyze computing cycles 24 hours a day, in detail. Combined with multiple other factors, the results contribute to helpful analytics.

For instance, if we know that we have multiple generative AI tasks, we can ask systems to schedule them during periods of lower demand from other systems. Or, a decision may be made to orchestrate workloads to other facilities with more capacity or lower power costs as part of a multi-site plan.

We could use AI to break down the workloads into smaller, manageable chunks for processing throughout a period of time, as well. Plans may dictate processing various steps of a query at multiple locations, with final results moving to a central site for processing into further decisions. The possibilities are limitless.
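The scheduling idea in the preceding paragraphs can be sketched as a simple greedy heuristic: place each deferrable generative-AI job into the hour with the lowest forecast demand from other systems. The demand forecast, job names and per-job load are all illustrative assumptions; a production scheduler would be far more sophisticated.

```python
# Greedy sketch of demand-aware scheduling: each deferrable AI job is
# assigned to the currently least-loaded hour in the forecast.
# Forecast values, job names and the 500 kW per-job load are assumptions.

def schedule_jobs(hourly_demand_kw: list, jobs: list) -> dict:
    """Assign each job to the hour with the lowest running demand."""
    demand = list(hourly_demand_kw)       # working copy of the forecast
    placement = {}
    for job in jobs:
        hour = demand.index(min(demand))  # pick the quietest hour
        placement[job] = hour
        demand[hour] += 500               # account for the job's own draw
    return placement

# 6-hour demand forecast (kW) and three deferrable training jobs.
forecast = [1200, 900, 650, 700, 1100, 1400]
print(schedule_jobs(forecast, ["train-A", "train-B", "train-C"]))
# {'train-A': 2, 'train-B': 3, 'train-C': 1}
```

Because each placement updates the forecast, the jobs spread across the three quietest hours instead of piling into one.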

When we couple machine intelligence with integrated energy storage systems (IESS), further possibilities exist. Batteries and intelligent energy storage are becoming great partners in the efficiency race. Peak shaving and load balancing are two such possible enhancements. Grid stresses trigger demand fees to consumers. The fees are in addition to consumption-based metering and sometimes represent 30-70% of the overall power bill.

Sometimes, demand charges reflect grid stress. Other utilities apply a ratchet clause, charging demand fees based on the highest usage during a year-long period. Using IESSs allows batteries to inject power during demand response events.
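The peak-shaving economics can be illustrated with simple arithmetic. The peak load, battery rating and demand rate below are assumptions for illustration; actual tariffs vary widely by utility and ratchet terms.

```python
# Illustrative peak-shaving arithmetic: battery discharge during the monthly
# peak lowers the billed demand. Rates and loads are assumed values.

def demand_charge(peak_kw: float, rate_per_kw: float) -> float:
    """Demand fee billed on the metered peak."""
    return peak_kw * rate_per_kw

def shaved_peak(peak_kw: float, battery_kw: float) -> float:
    """Billed peak after the battery covers part of the demand spike."""
    return max(peak_kw - battery_kw, 0)

peak, rate = 8000, 18.0  # assumed 8 MW monthly peak, $18/kW demand rate
before = demand_charge(peak, rate)
after = demand_charge(shaved_peak(peak, 1500), rate)  # 1.5 MW battery injection
print(before - after)  # 27000.0 -> $27,000 avoided this billing cycle
```

Under a ratchet clause the stakes are higher still, since a single unshaved peak can set the demand fee for the following twelve months.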

Further to power injection, IESSs work within the battery systems, enhancing sustainability efforts. Coupled with weather data, a facility can, for instance, take advantage of rapid recharge on sunny days. Selecting advantageous periods for battery discharge and recharge benefits the overall efficiency schemes while using the battery systems to their fullest potential. In fact, governments around the world recognize batteries as a key factor in taming data center power.

The U.S. DOE promotes IESS usage through programs such as the Industrial Energy Storage Systems Prize and the Infrastructure Investment and Jobs Act (IIJA, P.L. 117-58), offering grants up to $200,000 for small businesses to deploy BESS systems for peak shaving. In the EU, the Energy Efficiency Directive (EED) encourages data centers over 500 kW to use BESS for demand flexibility while aligning with their 75% renewable energy targets by 2030.

Other enhancements occur through the incorporation of additional data center systems and applications like Data Center Infrastructure Management (DCIM) systems. When a facility can tie cooling into the overall ecosystem management, the savings multiply.

Processing decisions, where possible, can be made not on the need for processing, but rather on the best time to process them for non-critical loads. Servers can communicate with counterparts and scale back, as necessary. For AI, that may mean training periods with the highest power consumption could be staggered so they do not coincide with other computing workloads. Once trained, the power requirements decrease for regular usage. Incremental training periods are scheduled similarly.

As sites mirror other sites, additional possibilities arise. Being able to select the most efficient site for bursty workloads adds to overall efficiency. Coordination with utility providers can enhance services, too. In some cases, utilities don’t assess demand fees if users can shed some load. 

Backup generation sources, including batteries, may cover all or some of the peak demand until grid stresses are relieved. As these backup generation sources are given more responsibility, battery selection becomes all the more important. Using a reliable, field-proven battery from an experienced manufacturer helps ensure performance.

AI-Driven Infrastructure: Maximizing Power, Minimizing Waste

High-density workloads and AI are reshaping facility design. Liquid cooling facilitates the use of higher-power chips needed to process and generate AI and data workloads in a fraction of the time. Using that speed and intelligence to enhance the infrastructure systems that keep them running is a win-win.

By staggering workloads and using AI to optimize them, data centers are on the cusp of alleviating many of the power challenges that nag the industry today.

About the Author

Carrie Goetz


Chief Technology Officer, C&D Technologies

Carrie Goetz is chief technology officer of C&D Technologies, a provider of energy storage and backup power.
