
TE Perspective on Data Center Performance

A More Sustainable Data Center

Author: Mike Tryson, VP & CTO, Data & Devices

With the rise of generative artificial intelligence (AI), data centers have become an increasingly critical part of the world’s technological infrastructure — and a growing consumer of electricity.


Data centers accounted for an estimated 1.3% of global electricity demand in 2022, according to the IEA. By nearly all accounts, that share is expected to keep growing rapidly as the amount of data and computation handled in the cloud continues to rise. That’s a challenge for a world in which global electricity demand is already expected to grow by more than 3% per year through 2026. The increasing competition for electrical power from growing consumer demand, expanding use of electric vehicles, increased heating and cooling due to changes in global climate, smart cities and the electrification of industrial enterprises means data centers will need to become more efficient to realize their full potential.

The good news is that engineers are already creating innovations to increase data center sustainability. The growing volume of data and the increasing speed at which that information moves mean that even seemingly small-scale improvements in the cables, connectors and heat dissipation equipment used inside today’s data centers can have a huge impact on the amount of energy those facilities consume — especially when you consider that effect gets multiplied across millions — or even billions — of interconnects in a modern data center.

Making Every Electron Count

All electricity must overcome resistive losses as it flows from one place to another. The farther you move that electricity, the more resistance you have to overcome — reducing the amount of usable electricity that reaches its final destination.

Furthermore, the “wasted” electricity generates heat, which requires additional energy to cool the data center so it operates within its design specifications. Multiply the amount of power lost over a single meter of cable by the number of meters of cable in a million-square-meter data center, and you inevitably come up with a very large number. Reducing those resistance losses means more of the power data centers purchase ends up actually powering their equipment. Less wasted electricity means less total consumption overall.
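
To make the scale concrete, here is a rough back-of-the-envelope sketch in Python. The conductor size, current and total cable length are illustrative assumptions, not figures from any particular facility:

```python
# Illustrative only: per-meter resistive loss in a copper conductor,
# scaled across an assumed total length of powered cabling.
RHO_COPPER = 1.72e-8      # ohm*m, resistivity of copper near room temperature
AREA_MM2 = 16             # assumed conductor cross-section
CURRENT_A = 30            # assumed current carried by the conductor
TOTAL_METERS = 500_000    # assumed meters of such cabling facility-wide

r_per_meter = RHO_COPPER / (AREA_MM2 * 1e-6)      # resistance per meter (ohms)
loss_per_meter = CURRENT_A**2 * r_per_meter       # P = I^2 * R, watts per meter
total_loss_kw = loss_per_meter * TOTAL_METERS / 1000

print(f"~{loss_per_meter:.2f} W lost per meter -> ~{total_loss_kw:.0f} kW facility-wide")
# With these assumptions: roughly 1 W per meter, adding up to hundreds of kilowatts.
```

Even if the real numbers differ, the shape of the result is the same: losses that look negligible per meter become a standing load at facility scale.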


Data centers have already successfully reduced resistance losses by transmitting power at higher voltage, reducing the current required per unit of power and thus the resistive losses incurred while moving it. Much of the industry has transitioned to 48-volt distribution systems, and data centers may look to even higher-voltage distribution in the future for additional reductions. Heavier-gauge cables with larger conductor cross-sections can carry the same current with less resistance, which also reduces losses.
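
A second sketch shows why distribution voltage matters. Delivering the same power at a higher voltage cuts the current proportionally, and the resistive loss falls with the square of the current; the path resistance below is an assumed value:

```python
# Illustrative only: resistive loss for the same delivered power at two
# distribution voltages over the same (assumed) path resistance.
PATH_RESISTANCE_OHM = 0.002   # assumed 2-milliohm distribution path
DELIVERED_POWER_W = 1_000     # power the load actually needs

def i2r_loss(power_w: float, volts: float, resistance_ohm: float) -> float:
    current = power_w / volts           # higher voltage -> lower current
    return current**2 * resistance_ohm  # loss falls with the square of the current

for volts in (12, 48):
    loss = i2r_loss(DELIVERED_POWER_W, volts, PATH_RESISTANCE_OHM)
    print(f"{volts} V distribution: ~{loss:.1f} W lost")
# 12 V: ~13.9 W; 48 V: ~0.9 W (quadrupling the voltage cuts this loss ~16x).
```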


Every connection between cables and devices represents another opportunity for resistance losses. Low-resistance busbars and connectors can help reduce those losses. Even if those reductions seem small on the scale of an individual connector, the overall impact across the large number of connections required by a large data center adds up quickly.
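
The same arithmetic applies at every contact. A rough sketch with assumed per-contact values shows how fractions of a watt per connection become a meaningful load facility-wide:

```python
# Illustrative only: aggregate loss from power contacts across a facility.
CONTACT_RESISTANCE_OHM = 0.0002   # assumed 0.2-milliohm contact resistance
CURRENT_A = 25                    # assumed current through each contact
CONTACT_COUNT = 2_000_000         # assumed number of power contacts facility-wide

loss_per_contact_w = CURRENT_A**2 * CONTACT_RESISTANCE_OHM   # ~0.125 W each
total_kw = loss_per_contact_w * CONTACT_COUNT / 1000

print(f"~{loss_per_contact_w:.3f} W per contact -> ~{total_kw:.0f} kW in total")
# Halving the contact resistance halves this entire figure.
```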


Moving Data More Efficiently

The ability to move data faster is critical for AI, data centers and the activities they support. The faster you move large volumes of data, however, the harder it becomes for connections at the ends of the cable to handle that capacity. Passive connections don’t use any additional power to move data from cable to device, but there’s a limit to the distance they can move data at any given speed while minimizing signal loss and degradation. Active connections can move more data farther and faster, but at the cost of significant power.
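
One way to see the tradeoff is to put rough numbers on it. The per-link power figures and link count below are assumptions for illustration, not vendor specifications:

```python
# Illustrative only: interconnect power in a cluster as the passive share changes.
LINK_COUNT = 50_000        # assumed number of high-speed links in a cluster
ACTIVE_W_PER_LINK = 5.0    # assumed draw of an active (retimed or optical) link, both ends
# Passive copper links are assumed to add no link power of their own.

for passive_share in (0.2, 0.5, 0.8):
    active_kw = LINK_COUNT * (1 - passive_share) * ACTIVE_W_PER_LINK / 1000
    print(f"{passive_share:.0%} passive links -> ~{active_kw:.0f} kW of link power")
# Every link that can stay passive is interconnect power the facility never has to buy or cool.
```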


At TE Connectivity, we are working to improve cables and connectors to maximize the capabilities of passive interconnects. The longer cables can run before they require active interconnects to maintain signal integrity, the better they can serve cloud and AI device clusters. Solutions that use external cables, or a combination of external and internal cables, in place of circuit boards can further reduce power consumption and data latency. Dialing in the right combinations of these factors might allow a data center to use more passive cabling while retaining higher speeds, reducing its energy consumption.


Connectors play an important strategic role in supporting future sustainability efforts, as well. The ability to modularize designs around standard connectors will allow data centers to upgrade their equipment more easily as innovations that produce faster, higher-bandwidth data transfer come along. By contrast, hard-wired connectorless solutions make it much more difficult to upgrade older devices in ways that improve their functional capability or efficiency. This lack of an easy upgrade path creates more waste and reduces a data center’s flexibility to take advantage of more efficient electronics or configurations as they become available. Connectors are a key element for supporting a more circular ecosystem in large data centers.

Reducing Heat

All devices in a data center generate heat. Increased operating temperatures decrease the reliability of products and components, which in turn increases the potential for downtime.
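
A common way to quantify that sensitivity is the Arrhenius acceleration model, sketched below with an assumed activation energy rather than data for any specific component:

```python
import math

# Illustrative only: Arrhenius acceleration of failure rates with temperature.
BOLTZMANN_EV_PER_K = 8.617e-5   # Boltzmann constant in eV/K
ACTIVATION_ENERGY_EV = 0.7      # assumed activation energy of the failure mechanism

def acceleration_factor(t_ref_c: float, t_hot_c: float) -> float:
    """Relative increase in failure rate when operating at t_hot_c instead of t_ref_c."""
    t_ref_k, t_hot_k = t_ref_c + 273.15, t_hot_c + 273.15
    return math.exp((ACTIVATION_ENERGY_EV / BOLTZMANN_EV_PER_K) * (1 / t_ref_k - 1 / t_hot_k))

print(f"55 C -> 75 C: ~{acceleration_factor(55, 75):.1f}x the failure rate")
# With these assumptions, a 20-degree rise roughly quadruples the expected failure rate.
```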


As speeds and distances exceed the limits of copper cabling, the move to optical cables makes heat an even greater concern. Optical components are much more sensitive to operating temperature than passive copper cables and connectors.


Just as electrons add up to sustainability gains when addressing resistance losses, every thermal unit counts when dealing with heat. Increased heat load due to “wasted” electricity forces data centers to lean even harder on fans and air conditioning systems to keep equipment cool — requiring even more energy. That means data centers must look closely at every part of their infrastructure to find opportunities to reduce heat and save energy.
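
A quick sketch of that multiplier effect, using an assumed overall cooling coefficient of performance (COP):

```python
# Illustrative only: the cooling energy that rides along with every watt of waste heat.
WASTE_HEAT_KW = 500     # assumed waste heat from resistive and conversion losses
COOLING_COP = 4.0       # assumed coefficient of performance of the cooling plant

cooling_kw = WASTE_HEAT_KW / COOLING_COP   # extra electrical power just to remove the heat
print(f"{WASTE_HEAT_KW} kW of waste heat -> ~{cooling_kw:.0f} kW of additional cooling power")
# Eliminating a watt of waste heat saves both the watt itself and the cooling needed to remove it.
```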


Efficient thermal management solutions, including heat sinks, are essential tools. To achieve maximum cooling efficiency with either forced-air or liquid cooling, you have to minimize the thermal contact resistance between hot electrical components and the cooling surfaces — especially when those surfaces aren’t completely flat. Any air gaps between surfaces reduce the efficiency of heat transfer, even at the micro-interface level. TE Connectivity’s conforming plate design for heat sink connections provides up to two times better heat transfer than common solutions such as gap pads and thermal greases. The closeness of these connections is important to sustainability because the heat generated by devices rises sharply as the power used in those devices increases.
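
A simplified thermal stack shows why the interface matters. The resistances and power below are assumptions for illustration, not TE test data:

```python
# Illustrative only: device temperature through a heat-sink stack with two
# different thermal interfaces (assumed resistances in degrees C per watt).
DEVICE_POWER_W = 25       # assumed heat dissipated by the device
INLET_AIR_C = 35          # assumed cooling air temperature
R_SINK_TO_AIR = 0.8       # assumed heat-sink-to-air thermal resistance, C/W

interfaces = {"gap pad": 0.5, "conforming plate": 0.25}   # assumed interface resistances, C/W

for name, r_interface in interfaces.items():
    device_temp = INLET_AIR_C + DEVICE_POWER_W * (R_SINK_TO_AIR + r_interface)
    print(f"{name}: device runs at ~{device_temp:.0f} C")
# With these assumptions, halving the interface resistance drops the device about 6 C,
# headroom that would otherwise have to come from more airflow or colder coolant.
```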


Doing the Right Thing

Data center operators understand they need to address their power consumption to ensure the future of the industry, especially as AI technology continues to develop. TE is helping our customers meet this goal through our product innovations. We’re also helping because, as engineers, it’s in our DNA. Being part of the solution is part of being an engineer: 87% of engineers say that supporting climate change solutions is personally important to them, according to the 2024 TE Connectivity Industrial Technology Index. Each incremental reduction in resistance or heat through data center innovations may be small on its own, but when deployed in large numbers, the impact multiplies. These efforts will be critical for sustaining the growth of data center technology and the industries and applications that rely on it.

About the Author

Mike Tryson, VP & CTO, TE Data & Devices


Mike Tryson is VP and CTO of TE’s Data & Devices business. In this role, he leads the Data & Devices engineering team and strategy, partnering with customers around the world to develop their interconnect solutions and architectures. Mike joined TE in 2011, bringing with him 25 years of experience in technology leadership roles. He has a track record of successfully introducing technology innovations to the data communications market.