Nvidia is partnering with German semiconductor company Infineon on a centralized power delivery system. This approach eliminates the need for conventional rack-level power conversion, reducing both copper consumption and waste heat.
As AI infrastructure keeps growing, data centers are approaching capacities once deemed unattainable, with forecasts indicating that power requirements for some rack configurations will surpass one megawatt within a few years. The conventional 54V DC standard is inadequate for these demands, which explains Nvidia's shift toward more scalable, forward-looking designs: for a given power draw, a higher distribution voltage means proportionally lower current, allowing thinner conductors and producing far less resistive heat. Compared with older systems, the new approach also reduces the number of conversion stages and components required, streamlining infrastructure and improving reliability. Importantly, it lowers material expenses and cooling needs, offering both environmental and financial benefits.
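The arithmetic behind the voltage argument is straightforward and can be sketched in a few lines. The calculation below uses the article's ~1 MW rack figure and the 54V standard it names; the 800V comparison value is an illustrative high-voltage DC bus, not a number taken from the article.

```python
# Back-of-the-envelope: why a higher distribution voltage cuts copper and heat.
# For a fixed power draw P, current I = P / V, and resistive loss in a
# conductor is I^2 * R, so raising the voltage sharply lowers both the
# current the busbars must carry and the heat they dissipate.

def bus_current(power_w: float, voltage_v: float) -> float:
    """Current drawn from the bus for a given power and voltage (I = P / V)."""
    return power_w / voltage_v

RACK_POWER_W = 1_000_000  # the ~1 MW rack power the article anticipates

i_54v = bus_current(RACK_POWER_W, 54)    # roughly 18,500 A
i_800v = bus_current(RACK_POWER_W, 800)  # 1,250 A (illustrative bus voltage)

# Relative resistive loss in identical conductors scales with I^2:
loss_ratio = (i_54v / i_800v) ** 2

print(f"54 V bus current:  {i_54v:,.0f} A")
print(f"800 V bus current: {i_800v:,.0f} A")
print(f"Loss ratio (same conductor): {loss_ratio:,.0f}x")
```

At 54V, a megawatt rack would draw tens of thousands of amperes, which is why the older standard forces massive copper cross-sections and aggressive cooling; centralizing conversion at a higher voltage sidesteps both.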
Scaling the effort further, Nvidia has teamed up with Schneider Electric to refine how AI data centers are deployed and cooled. Their joint reference designs, tested in recent weeks, reportedly cut cooling-related power usage by up to 20 percent and reduce setup times by nearly a third. These changes not only lower operational costs but also make it easier for companies to expand their infrastructure sustainably. The collaboration speaks to Nvidia's broader strategy: not just building faster chips, but creating an ecosystem that supports long-term scalability.
The statistics behind this transition are equally persuasive. In the final quarter of 2024, global sales of AI chips for data centers surged to $32.6 billion, with GPUs holding a significant share of the market. Meanwhile, Nvidia's leadership continues to forecast a 300 percent increase in data center spending on AI over the next three years. While artificial intelligence has the potential to transform nearly every sector, the rising demand comes at a price: global data centers currently consume roughly 2 percent of the world's electricity, a figure expected to double by the end of the decade as AI-specific infrastructure grows. Rather than letting that growth become a burden, Nvidia is taking proactive measures.
In many ways, Nvidia's announcement marks more than a product launch; it signals a shift in tech leadership from pure processing dominance to operational intelligence. By addressing environmental concerns and business scalability in a single move, the company is setting a high bar not just for its competitors but for the entire industry. This could well be the moment where efficiency becomes as valuable as raw power, and where thoughtful engineering begins to matter as much as breakthrough performance.