Principled Technologies report: In-depth research. Real-world value.

Dell PowerEdge server cooling: Choose the cooling options that match the needs of you and your workloads

The Dell server portfolio powered by AMD EPYC enables multiple, flexible cooling options to suit your business-critical workloads, including AI

As compute-intensive activities such as artificial intelligence (AI) workloads have continued to grow in prominence worldwide, so have data center power requirements. While server energy consumption has grown steadily, the power required to cool those servers to keep them from overheating and failing is also a big piece of the power cost puzzle.1 According to one McKinsey analysis, “Cooling accounts for some 40 percent of a data center’s energy consumption,” and “efficient cooling is therefore a crucial driver of a data center’s profitability.”2 While no single cooling solution is right for every situation, air cooling, liquid cooling, and combinations of the two each offer unique advantages.

Dell Technologies offers a wide range of cooling technologies to suit the needs of different organizations and workloads. By choosing new Dell PowerEdge servers powered by AMD EPYC processors, you also access unique innovations in air cooling and liquid cooling, with the option to configure the right cooling solution for your specific needs. In this paper, we review publicly available data to discuss the Dell portfolio of server cooling technologies, including rack and multi-rack cooling, and discuss various factors you might consider when crafting your approach to cooling.

Prioritize your needs with a range of cooling options for Dell PowerEdge servers

Cooling is a critical concern for businesses sustaining their own data centers. For perspective, let’s look at the power requirements of a single rack and the equivalent cooling it needs. Until recently, one fully populated rack, configured without power supply redundancy, might draw up to 20 kilowatts (kW) of power in a real-world data center environment.3 Past that density, air cooling can struggle to keep up. For reference, a typical US household has a maximum feed of 24 or 48 kW (100 or 200A at 240V).4 That means a trio of populated racks drawing 20 kW each, or 60 kW in total, would exceed even the larger household feed, before accounting for the additional power draw that cooling creates.

But with AI driving dramatically higher power needs, we’re now moving far beyond that level of power consumption. While not all AI tasks consume high amounts of energy, tasks like generating a single image can require as much energy as charging a phone—energy consumption that stacks with each user.5 Morgan Stanley estimates that “by 2025, generative AI’s power demands alone could equal a third of what data centers needed for all of 2022.”6 A Data Center Dynamics article notes that for racks packed full of AI servers and powerful GPUs, power needs are likely to increase to between 40 and 60 kW—double or triple what we used to consider a rack’s power needs.7 At these levels of energy consumption, new cooling solutions are required to maintain high density. Direct liquid cooling is one such solution. Significantly more energy-efficient than air cooling, direct liquid cooling can translate to a smaller data center footprint, lower energy costs, less data center noise, and better sustainability.8

Dell is ready. The company is currently developing liquid-cooled solutions that will support greater than 400 kW in a single rack, a 20x increase from the old 20 kW standard in the same data center footprint.9 To give just one example, by utilizing direct liquid cooling, the new Dell Integrated Rack 7000 Series supports up to 480 kW per 21-inch ORv3 rack. The rack has the potential to include up to 72 nodes powered by 5th Gen AMD EPYC processors, and Michael Dell notes that “the system is engineered to capture nearly 100% of the heat produced within each rack.”10 With the dramatic increase in power and heat production from servers running heavy workloads, innovative cooling techniques will be essential for high-density/high-power deployments.

In this increasingly expensive space, administrators must make three overall choices when strategizing their approach to cooling: how to design the physical space of the data center, what systems to invest in solely for cooling (e.g., specialized in-row coolers), and what cooling technologies to choose inside the servers or racks themselves. While these decisions are interconnected, in this analysis we address the latter two questions, with a particular focus on the cooling technologies inside servers.

When deploying servers, you can choose air cooling, liquid cooling, or a combination of the two. Dell offers a wide range of server cooling technologies both inside its PowerEdge servers and at the rack and multi-rack level. Below, we dive into key options, arming you with the data and context you need to make informed decisions about the right cooling approaches for your workloads.

Air cooling

As you might expect, air cooling systems use air to cool servers and server components via fans, heat sinks, and other airflow management techniques. In a direct-air-cooled system, the server rack draws in ambient air (generally between 64 and 81 degrees Fahrenheit11) from the front of the chassis; passes it through the entire system, during which it grows hotter due to the heat generated by the components in the system; and expels it from the rear of the chassis, transferring the heat to the environment behind the server rack. Data center designers can enhance the efficacy and efficiency of this approach with hot and cold aisle containment, which separates areas of warmer and cooler air and thus prevents the recycling of hot air into the server.

Some data centers that use air cooling pull in outside air to improve efficiency, particularly during colder months, while the traditional approach uses heat exchange systems that filter and condition the air to cool it. Newer design strategies such as variable-speed fans and precision airflow management systems can adjust airflow based on temperature and other factors.
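
To make the idea concrete, the short sketch below maps inlet air temperature to a fan duty cycle with simple linear interpolation between breakpoints. The breakpoints are hypothetical values for demonstration only, not Dell’s control algorithm, which also weighs component sensors and power data.

```python
# Illustrative fan-curve sketch: map inlet air temperature to a fan duty cycle.
# The breakpoints are assumptions for demonstration, not from any Dell specification.
def fan_duty_pct(inlet_c: float) -> float:
    curve = [(18, 20), (25, 35), (30, 55), (35, 80), (40, 100)]  # (deg C, duty %)
    if inlet_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if inlet_c <= t1:
            # Linearly interpolate between neighboring breakpoints
            return d0 + (d1 - d0) * (inlet_c - t0) / (t1 - t0)
    return float(curve[-1][1])

print(fan_duty_pct(27.0))  # ~43% duty for a 27 deg C inlet
```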

Dell air cooling technologies

All mainstream Dell PowerEdge servers offer air cooling by default, though many servers incorporate different innovations in air cooling technologies. According to a Dell blog focused on PowerEdge servers with AMD processors, such as the PowerEdge R7625 with AMD EPYC 9754 CPUs, “Our unique Smart Cooling technology uses computational fluid dynamics to discover the optimal airflow configurations for our PowerEdge servers.”12 Even very powerful Dell servers utilize air cooling: The Dell PowerEdge R7725, with two AMD EPYC 9005 series processors, is supported in an air-cooled configuration, while the PowerEdge XE9680, available with eight AMD Instinct MI300X GPUs, has a 5U footprint with air cooling as the default.13 PowerEdge servers also incorporate T-shaped motherboards, multi-vector cooling, thermal and power sensors, and airflow control algorithms to optimize air cooling.14 Dell notes that its fans and heatsinks “are qualified using mandated and extensive reliability and qualification processes to run at full speed for the life cycle of the server[,] minimizing costly downtime.”15 In one example, the modular AMD EPYC processor-powered Dell PowerEdge C6615 features an optimized thermal solution that allows for air-cooling configurations with up to 53% improved cooling performance compared to the previous generation chassis.16

For data centers that use hot/cold aisle configurations, some PowerEdge servers offer front input/output and power, allowing IT teams to access the servers from the cold aisle.17 This convenience enables IT to service equipment without the discomfort of standing in the full heat of thermal exhaust.

Smart Flow configurations

Some new Dell PowerEdge servers use Smart Flow configurations, which incorporate an airflow channel through the center of the server, where a middle storage slot would normally be.19 These servers, which also have redesigned backplanes to allow for more air intake, enable the use of more powerful CPUs and GPUs without liquid cooling. According to Dell, a PowerEdge R7625 server supporting high-powered processors like the AMD EPYC 9684X can achieve an approximately 17 percent higher rate of airflow, in cubic feet per minute (CFM), than a traditional 24x2.5-inch GPU chassis.20 Figure 1 illustrates a standard PowerEdge server configuration compared to a Smart Flow configuration. According to Dell, Smart Flow “reduces fan power by up to 52% compared to previous generation servers.”21

Figure 1: A standard Dell PowerEdge server configuration (left) and a Dell Smart Flow configuration (right). The lines represent temperature, with blue indicating cooler temperatures and red indicating hotter ones. The Smart Flow configuration efficiently takes in cool air from outside the server and creates cooler air around the processors, which leads to cooler exhaust at the back of the server and less heat throughout the system. This configuration offers significantly better thermal efficiency, meaning lower cooling costs and less work for the data center’s cooling systems at large. Source: Dell, https://www.dell.com/en-us/blog/save-money-with-innovative-air-cooling-for-your-servers/.

Liquid cooling ecosystem

Liquid cooling systems use either water or specialized coolants to keep server components cool. Cooling systems that incorporate both air and liquid cooling elements let customers choose the approach that works best for their specific situations.

Dell embraces cooling solutions that make use of both direct liquid cooling and air cooling on different internal server components. In the Dell cooling ecosystem, direct liquid cooling is generally reserved for CPUs and GPUs via cold plates, while air cooling systems cool other server components such as storage drives, memory, and networking hardware. This reduces the complexity and maintenance requirements of the liquid cooling system and its internal fluid piping and cold plates. Dell internal modeling indicates that using direct liquid cooling (DLC) for CPUs and air cooling for everything else “can use just 3% to 4% more energy in cooling” than a fully direct-liquid-cooled solution, which Dell calls a “cold plate everything” approach, while reducing cooling system design complexity and streamlining maintenance processes.22 This approach allows companies with less computationally intensive workloads to access many of the benefits of liquid cooling without the high cost of a fully direct-liquid-cooled solution.

Because liquids conduct thermal energy more effectively than air and can retain more energy by volume, they have the potential to remove heat from a system more effectively and enable greater density in data center deployments. Liquid cooling allows for more uniform distribution of cooling across hardware and eliminates the hot spots and other airflow control concerns associated with air-cooled data centers.

Two common liquid cooling methods are direct-to-chip cooling and immersion cooling. Direct-to-chip systems circulate liquid through cold plates attached to heat-generating components such as CPUs or GPUs, thereby transferring the heat to a separate component that cools it. Immersion cooling systems submerge server components in a non-conductive dielectric fluid that directly absorbs heat. In this paper, when we discuss liquid cooling, we will generally focus on direct-to-chip cooling. Dell offers immersion cooling solutions via Dell Technologies OEM Solutions.

Dell liquid cooling technologies

The Dell liquid cooling ecosystem uses direct-to-chip cold plates in conjunction with coolant distribution units (CDUs) to manage the coolant loop. In-rack CDUs provide enough flow to cool an entire rack, while in-row CDUs can support the coolant needs of multiple racks. Via the primary flow network, CDUs receive cool water from the facility’s water supply and use an internal heat exchanger to control coolant temperature. The secondary flow network, which runs between the CDU and the rack manifolds, distributes coolant at the appropriate volume and temperature to the coolant manifolds on the racks the CDU supports; the CDU’s internal heat exchanger then cools the warm coolant returning from this secondary loop. CDUs also ensure that the coolant flowing to the rack coolant manifolds remains above the dew point temperature of the environmental air, to prevent condensation from forming inside the hardware.23
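
To make the dew-point constraint concrete, the sketch below estimates dew point with the Magnus approximation and floors a coolant supply setpoint above it. The coefficients, 2-degree margin, and 32-degree target are illustrative assumptions, not values from Dell CDU firmware.

```python
import math

def dew_point_c(air_temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (deg C) using the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water vapor over liquid water
    gamma = (a * air_temp_c) / (b + air_temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def coolant_setpoint_c(air_temp_c: float, rel_humidity_pct: float,
                       target_c: float = 32.0, margin_c: float = 2.0) -> float:
    """Never let the coolant supply setpoint fall below dew point + margin."""
    return max(target_c, dew_point_c(air_temp_c, rel_humidity_pct) + margin_c)

# Example: a 24 deg C room at 60% relative humidity has a dew point near 15.7 deg C,
# so the hypothetical 32 deg C target setpoint is safely above it.
print(coolant_setpoint_c(24.0, 60.0))
```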

Dell also offers rear-door heat exchangers (RDHx) as an alternative to or in conjunction with servers enabled with DLC. A radiator in the RDHx uses facility water to cool the hot air that the server expels and then releases the cooled air to the environment. These heat exchangers fit on the back of the server rack, while their CDUs can work with either a single rack or a row of racks.24 RDHx provides customers a scalable and cost-effective option to integrate liquid cooling into their infrastructure without the disruption associated with retrofitting individual servers with direct-to-chip liquid cooling.

Cooling options by server model

Table 1 shows which 16th Generation and 17th Generation AMD EPYC processor-powered Dell PowerEdge server platforms support direct liquid cooling. (While some Dell solutions using these systems require a specific type of cooling, all of these AMD processor-powered PowerEdge models support air cooling.) We collected this data from the publicly available specification sheets for each server model. Unless we note otherwise, all models that support DLC are rack solutions that require rack manifolds and a CDU (in-rack or in-row) to operate.

Table 1: 16th Generation and 17th Generation AMD EPYC processor-powered Dell PowerEdge server platforms and their support for direct liquid cooling. Source: Publicly available Dell specification sheets.

Model   | Not available (air cooled only) | Optional DLC configuration available
R6615   | X                               |
R6715   | X                               |
R6625   |                                 | X
R6725   |                                 | X
R7615   | X                               |
R7715   | X                               |
R7625   |                                 | X
R7725   |                                 | X
XE7745  | X                               |
M7725   |                                 | X
C6615   | X                               |

Comparing server cooling methods

Determining the right cooling method for your situation comes down to several key factors, including data center footprint limitations, existing/available cooling infrastructure, site power limitations, environmental cooling options, and environmental commitments.

At a high level, if you have a large data center with significant available space and CPU-only workload needs, traditional air cooling is the simplest solution to engineer. More constrained for space, or scaling up GPU-heavy AI workloads? Use liquid cooling to manage the high-performance infrastructure in your data center. Need the ultimate in high-density, high-performance computing? Utilize immersion cooling—available via Dell Technologies OEM Solutions—to keep every component chilled to optimum temperatures.

Even the choice of individual components within servers can impact the decision of how you cool your servers. As just one example, consider CPUs. The cooling needs of a server using a lower thermal design power (TDP) processor, such as the 24-core AMD EPYC 9224 at 200W, are different from those of a server using a higher TDP processor, like the 192-core AMD EPYC 9965 at 500W. Or how about GPUs? If you’re building multiple GPU-heavy racks for intensive AI workloads, your cooling needs will be dramatically greater than they would be for a rack of CPU-only servers running everyday database workloads.
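
A back-of-the-envelope calculation shows the gap. The rack size below and the CPU-only accounting are assumptions for illustration; real heat loads also include memory, storage, GPUs, and power-conversion losses.

```python
def rack_cpu_heat_kw(servers: int, cpus_per_server: int, tdp_w: int) -> float:
    """Rough CPU-only thermal load for a rack, in kW (assumes CPUs at full TDP)."""
    return servers * cpus_per_server * tdp_w / 1000.0

# Hypothetical rack of 20 dual-socket servers:
print(rack_cpu_heat_kw(20, 2, 200))  #  8.0 kW with 200W AMD EPYC 9224-class CPUs
print(rack_cpu_heat_kw(20, 2, 500))  # 20.0 kW with 500W AMD EPYC 9965-class CPUs
```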

In data center engineering, one size rarely fits all. You must evaluate these technologies—and figure out which is right for you—by balancing efficiency, scalability, reliability, cost effectiveness, thermal management, energy consumption, and environmental concerns.

Efficiency

Liquid cooling is typically more energy-efficient than air-cooled infrastructure. One scientific review of liquid cooling calls direct liquid cooling “a transformative technology for enhancing energy efficiency and operational safety in high-density computing environments” with “potential to substantially lower thermal resistances and improve energy utilization.”28 Dell claims that using direct liquid cooling instead of traditional air cooling via perimeter air handlers can save 11 percent of the total energy a rack consumes, as much as 16 percent using rear-door heat exchangers, and 18 percent using both DLC and RDHx in tandem.29

Some new technologies can help to increase efficiency in air-cooled environments. For example, Dell Smart Cooling technology comprises “intelligent fan control algorithms that work in unison to maintain the lowest possible fan power state without sacrificing cooling or the reliability of the server.”30 Dell also offers Smart Flow configurations that enable more airflow into the server, allowing for higher-powered processors that would otherwise require a DLC solution.31

That said, air cooling solutions that dissipate more heat may also consume more energy. According to a white paper published by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), “The ability to move additional airflow and enabling higher rack heat loads is not without cost…increasing air cooling density not only increases the amount of power used to move that air, but also reduces cooling efficiency.”32

Scalability

Both air and liquid cooling can be used at scale, but environmental, facility, and density factors affect how efficient and cost-effective each approach is. Air cooling requires a larger facility footprint due to its inefficiencies at higher power densities and requires more energy than liquid cooling. As density increases, issues can arise in airflow management and excess heat, and noise levels and air velocity from air-cooled data centers can present complications at scale. Air cooling also faces limitations for the high-density, GPU-driven AI workloads that are growing increasingly popular. According to one Data Center Dynamics article, “air cooling can support high-performance computing (HPC) and AI workloads to a degree” but traditional air cooling “may struggle to adequately dissipate the high heat density generated by AI workloads.”33

Data center layout can also be a factor in the scalability of your cooling solution. For example, if there isn’t adequate roof space for the required number of condensers or chillers, waste heat from discharge vents can affect the efficiency of other units nearby.

Liquid cooling at scale presents its own challenges. The higher thermal efficiency of direct liquid cooling enables significantly greater power density than air cooling, which can make it difficult to physically route enough power to serve the data center’s needs. Liquid cooling also requires significant capital expenditure, making it more expensive to scale.

Operational reliability

The reliability of a cooling solution is directly linked to its performance: Can it keep a server cool enough to continue functioning as expected? If a server gets too hot, its performance will suffer, or it may shut down entirely. We saw this occur in our earlier high-temperature testing study: In an HVAC malfunction scenario, the Supermicro server we tested experienced OS SSD failure and ultimately stopped responding.34 There can also be more subtle costs to servers running at high temperatures. According to a Forbes article, “components that operate at cooler temperatures often experience less wear and tear, resulting in extended lifespan and reduced maintenance costs.”35

Air cooling systems are typically more robust and have fewer points of failure than more complex liquid cooling solutions, at the cost of decreased thermal and energy efficiency. However, at higher density and thermal load, air cooling systems can encounter difficulties managing airflow and waste heat, leading to issues such as hot spots—areas in the data center where the cooling systems fail to adequately dissipate the heat generated by the infrastructure—which in turn impact the overall operational efficiency of the hardware.

Due to their increased complexity, liquid cooling solutions have more potential points of failure, so admins must monitor them for problems such as pump failures that could rapidly compromise the ability of the system to maintain safe operating temperatures. Admins must also ensure that liquid cooling systems maintain coolant temperatures above the dew point of the ambient air, or risk condensation forming on coolant system components, which could cause electrical shorts and damage hardware. Because liquid cooling involves the use of fluids, buyers may worry about that fluid leaking and damaging the server. To address this concern, Dell PowerEdge servers with liquid cooling include Leak Sense technology, which “detects and reports any fluid leak at the cold plate via iDRAC” and automatically powers down the server if a coolant leak occurs.36
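
As an illustration of this kind of monitoring, the sketch below polls temperature sensors through the standard Redfish Thermal resource that iDRAC exposes. The address, credentials, and alert threshold are hypothetical, and Leak Sense detection and automatic shutdown happen on the server itself, not in a script like this.

```python
# Hedged sketch: poll temperature sensors via the Redfish Thermal resource that
# iDRAC exposes. Host, credentials, and the 85 deg C threshold are assumptions.
import requests

IDRAC_URL = "https://192.0.2.10"       # hypothetical iDRAC address
AUTH = ("monitor_user", "secret")      # hypothetical read-only account

def read_temperatures() -> dict:
    resp = requests.get(
        f"{IDRAC_URL}/redfish/v1/Chassis/System.Embedded.1/Thermal",
        auth=AUTH, verify=False, timeout=10)
    resp.raise_for_status()
    return {t["Name"]: t.get("ReadingCelsius")
            for t in resp.json().get("Temperatures", [])}

for sensor, celsius in read_temperatures().items():
    if celsius is not None and celsius > 85:  # illustrative alert threshold
        print(f"ALERT: {sensor} reads {celsius} deg C")
```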

Cost-effectiveness

If acquisition cost is the driving factor in your server purchase, air cooling may be the right answer. As the default choice for many servers, air cooling is generally less expensive than liquid cooling. And if you aren’t pushing for the highest possible performance with heavy use of GPUs and powerful CPUs, such as the AMD EPYC 9755 processor, air cooling may be adequate. For “standard workloads with lower densities…air cooling remains a viable solution,” according to Dell, “keeping costs reasonable for moderate demands.”37

Liquid cooling generally requires a greater up-front investment than air cooling, but it has the potential to reduce energy costs over time. Dell notes that choosing direct liquid cooling can reduce energy costs by up to 45 percent and, in some cases, pay back its cost within 1.3 years.38
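
The arithmetic behind such a payback estimate is simple division. The dollar figures below are hypothetical, chosen only to show how a roughly 1.3-year simple payback could arise; they are not Dell’s numbers.

```python
def simple_payback_years(dlc_premium_usd: float,
                         annual_cooling_energy_usd: float,
                         savings_fraction: float) -> float:
    """Years for energy savings to recoup the DLC up-front premium (no discounting)."""
    return dlc_premium_usd / (annual_cooling_energy_usd * savings_fraction)

# Hypothetical inputs: $35,000 DLC premium, $60,000/year cooling energy, 45% savings
print(round(simple_payback_years(35_000, 60_000, 0.45), 1))  # -> 1.3 years
```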

Thermal management capabilities

Air and liquid cooling options can both be effective methods for removing heat from server hardware. Once the heat leaves the hardware, however, you must also remove it from the surrounding environment. Costs in this area center on the electricity bills for HVAC and fan operation.

Air-cooled systems pull heated air back through air handlers to chill and redistribute it into the data center for cooling again. Chillers between racks or within racks provide supplemental cooling to high-density infrastructure to increase density capabilities, but don’t alter the equation for moving the air out of the data center. This also applies to servers with closed-loop direct liquid cooling.

When local environmental conditions allow, you can draw in fresh air from outside for cooling and move the hot air back outdoors. Some Dell systems are designed to operate in this type of extended thermal envelope, increasing the range of climates for this more cost-effective option.39

Liquid cooling that the data center provides to the rack offers a more effective way to remove heat from the components, servers, and whitespace of a facility (the area of the data center that stores server racks and other equipment). This technique removes heat directly from the loop with radiators, a heat exchanger, a secondary cooling loop, or other liquid cooling systems. All these methods provide a direct conduit for the thermal energy to leave the building, leaving the data center itself quieter and cooler.

Energy consumption and environmental impact

According to an analysis by McKinsey and Company, cooling accounts for about 40 percent of energy consumption in a data center.40 Data center energy consumption is set to more than double by 2028 due to increased data center capacities, driven in large part by AI.41 To address this, new AMD EPYC processors deliver more core density than previous generations, resulting in higher performance per watt and better energy efficiency. Traditional air cooling options may still suffice for smaller, lower-density workloads (10 to 15 kW per rack), especially with Dell Smart Flow technology. But as you scale up, air cooling can introduce problems such as increased cooling costs, meaning higher energy consumption and possible performance hits. Because liquid can transfer heat more efficiently than air, a dedicated liquid cooling solution can help decrease energy consumption by up to 40 percent versus traditional air-cooled solutions.42

New air cooling options are also increasing power efficiency, which means you don’t have to fully switch from air cooling to liquid cooling to see results. In one study, the American Society of Mechanical Engineers found that gradually switching from a 100 percent air-cooled environment to a 75 percent water-cooled and 25 percent air-cooled facility decreased energy consumption by 27 percent.43

These energy savings can decrease both data center operating costs and carbon footprints. A report from Google showed that by using water-cooling solutions in its data centers, the company was able to save approximately 10 percent in energy usage and reduce its carbon footprint by a similar 10 percent, a reduction of 300,000 tons of CO2 emissions.44

That said, liquid cooling also has an environmental downside. When data centers consume water, it sometimes becomes “unsuitable for human consumption or agricultural use,” as a Dallas Morning News article observed. The article goes on to note that while dry air is ideal for data centers, building data centers in “water-stressed regions, such as the southwest United States,” stresses those areas’ water supplies even further.45

As AI continues to grow and demand more power, many organizations are seeking innovative solutions to minimize the environmental impact of the data center. There are even possibilities to turn the heat that data centers generate into a boon for communities: some Nordic countries have worked with data centers to use heat exchangers to turn excess data center heat into hot water for homes.46

Conclusion

Keeping server components cool is essential to both short-term and long-term hardware health, ensuring that critical workloads stay up and running. To help data centers beat the increasing heat that AI and other demanding technologies produce, the Dell PowerEdge server portfolio offers a wide range of modern cooling technologies tailored to fit a variety of needs.

From traditional air-cooled servers, including Dell Smart Flow configurations, for less taxing workloads to direct liquid cooling and hybrid technologies that bring the benefits of liquid cooling to GPU-dense AI workloads, the PowerEdge server portfolio is poised to help you meet the cooling demands of your data centers both today and into the future. By selecting a suitable cooling approach (or combination of approaches) from the Dell PowerEdge portfolio, you can keep your data centers cool, ensure reliability, and minimize ongoing operating expenses related to cooling.

  1. “Investing in the rising data center economy,” accessed December 13, 2024, https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/investing-in-the-rising-data-center-economy.
  2. “Investing in the rising data center economy.”
  3. “An introduction to liquid cooling in the data center,” https://www.datacenterdynamics.com/en/analysis/an-introduction-to-liquid-cooling-in-the-data-center/.
  4. “Do Most Homes Have 100 Or 200 Amp Service?” accessed December 18, 2024, https://smartmainpanel.com/do-most-homes-have-100-or-200-amp-service/.
  5. Casey Crownhart, “AI is an energy hog. This is what it means for climate change,” accessed January 23, 2025, https://www.technologyreview.com/2024/05/23/1092777/ai-is-an-energy-hog-this-is-what-it-means-for-climate-change/.
  6. Morgan Stanley, “Powering the AI Revolution,” accessed December 20, 2024, https://www.morganstanley.com/ideas/ai-energy-demand-infrastructure.
  7. Matthew Gooding, “Newmark: US data center power consumption to double by 2030,” accessed December 20, 2024, https://www.datacenterdynamics.com/en/news/us-data-center-power-consumption/.
  8. Chris Carreiro, “Liquid cooling: The unsung hero of AI?” accessed January 24, 2025, https://datacentrereview.com/2024/12/liquid-cooling-the-unsung-hero-of-ai/.
  9. Michael Dell, accessed December 18, 2024, https://www.linkedin.com/posts/mdell_our-new-ir7000-supports-up-to-480kw-per-21-activity-7266554414226653184-qdf5/?utm_source=share&utm_medium=member_desktop.
  10. Michael Dell, accessed December 18, 2024.
  11. “2021 Equipment Thermal Guidelines for Data Processing Environments,” accessed January 22, 2025, https://www.ashrae.org/file%20library/technical%20resources/bookstore/supplemental%20files/therm-gdlns-5th-r-e-refcard.pdf.
  12. Tara Anders, “Dell and AMD: Redefining Cool in the Data Center,” accessed December 17, 2024, https://www.dell.com/en-us/blog/dell-and-amd-redefining-cool-in-the-data-center/.
  13. “PowerEdge XE9680,” accessed January 30, 2025, https://www.delltechnologies.com/asset/en-in/products/servers/technical-support/poweredge-xe9680-spec-sheet.pdf.
  14. “Beat the Heat in Your Data Center with Dell Smart Power and Cooling,” accessed December 17, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/briefs-summaries/beat-the-heat-in-your-data-center-with-dell-smart-power-and-cooling-brochure.pdf&tab0=0?hve=Read+the+brief.
  15. “Designing Innovative Liquid Cooling Solutions: From The Data Center to The Edge,” accessed December 16, 2024, https://www.delltechnologies.com/asset/es-es/solutions/oem-solutions/briefs-summaries/dell-oem-liquid-cooling-brochure.pdf.
  16. “Dell PowerEdge C6615: Maximizing Value and Minimizing TCO for Dense Compute and Scale-out Workloads,” accessed January 23, 2025, https://infohub.delltechnologies.com/en-us/p/the-dell-poweredge-c6615-maximizing-value-and-minimizing-tco-for-dense-compute-and-scale-out-workloads/.
  17. “Address Your Data Center Power and Cooling Challenges Now and Into the Future,” accessed December 17, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/industry-market/address-your-data-center-power-and-cooling-challenges-now-and-into-the-future.pdf.
  18. Principled Technologies, “Improving energy efficiency in the data center: Endure higher temperatures with confidence with Dell PowerEdge HS5620 servers,” accessed December 20, 2024, https://www.principledtechnologies.com/clients/reports/Dell/Dell-PowerEdge-HS5620-server-thermal-testing-0524/index.php.
  19. Delmar Hernandez, “Improved PowerEdge Server Thermal Capability with Smart Flow,” accessed December 16, 2024, https://infohub.delltechnologies.com/en-us/p/improved-poweredge-server-thermal-capability-with-smart-flow/.
  20. Delmar Hernandez, “Improved PowerEdge Server Thermal Capability with Smart Flow.”
  21. “Next-Generation Dell PowerEdge Servers Deliver Advanced Performance and Energy Efficient Design,” accessed December 16, 2024, https://www.prnewswire.com/news-releases/next-generation-dell-poweredge-servers-deliver-advanced-performance-and-energy-efficient-design-301721620.html.
  22. Travis Vigil, “Diving Deep into the Liquid Server Cooling Choices,” accessed December 16, 2024, https://www.dell.com/en-us/blog/diving-deep-into-the-liquid-server-cooling-choices/.
  23. Emily Clark, Tim Shedd, “Deep Dive into Direct Liquid Cooling,” accessed December 17, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/industry-market/deep-dive-into-direct-liquid-cooling.pdf.
  24. “Address Your Data Center Power and Cooling Challenges Now and Into the Future,” accessed December 17, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/industry-market/address-your-data-center-power-and-cooling-challenges-now-and-into-the-future.pdf.
  25. “Designing Innovative Liquid Cooling Solutions: From The Data Center to The Edge,” accessed December 16, 2024, https://www.delltechnologies.com/asset/es-es/solutions/oem-solutions/briefs-summaries/dell-oem-liquid-cooling-brochure.pdf.
  26. Jeff Clarke, “Unleashing the Future: Dell’s Journey in Liquid Cooling Innovation,” accessed December 13, 2024, https://www.dell.com/en-us/blog/unleashing-the-future-dell-s-journey-in-liquid-cooling-innovation/.
  27. Peter Judge, “Air cooling will never go away,” accessed December 18, 2024, https://www.datacenterdynamics.com/en/analysis/air-cooling-will-never-go-away/.
  28. Rui Kong, Hainan Zhang, Mingsheng Tang, Huiming Zou, Changqing Tian, and Tao Ding, “Enhancing data center cooling efficiency and ability: A comprehensive review of direct liquid cooling technologies,” accessed December 13, 2024, https://www.sciencedirect.com/science/article/abs/pii/S0360544224026203.
  29. Travis Vigil, “Diving Deep into the Liquid Server Cooling Choices.”
  30. “Save Money with Innovative Air Cooling for your Servers,” accessed December 17, 2024, https://www.dell.com/en-us/blog/save-money-with-innovative-air-cooling-for-your-servers/.
  31. “Save Money with Innovative Air Cooling for your Servers.”
  32. “Water-Cooled Servers,” accessed December 16, 2024, https://www.ashrae.org/File%20Library/Technical%20Resources/Bookstore/WhitePaper_TC099-WaterCooledServers.pdf.
  33. David Watkins, “Cooling solutions for AI workloads in evolving data centers,” accessed December 20, 2024, https://www.datacenterdynamics.com/en/opinions/cooling-solutions-for-ai-workloads-in-evolving-data-centers/.
  34. Principled Technologies, “Improving energy efficiency in the data center: Endure higher temperatures with confidence with Dell PowerEdge HS5620 servers,” accessed December 20, 2024, https://www.principledtechnologies.com/clients/reports/Dell/Dell-PowerEdge-HS5620-server-thermal-testing-0524/index.php.
  35. Bill Geary, “Revolutionizing Data Center Sustainability: The Power Of Liquid Immersion Cooling Technology,” accessed December 16, 2024, https://www.forbes.com/councils/forbestechcouncil/2024/03/06/revolutionizing-data-center-sustainability-the-power-of-liquid-immersion-cooling-technology/.
  36. “Keep Your Cool as Heat and Power Demands Increase,” accessed December 16, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/briefs-summaries/keep-your-cool-as-heat-and-power-demands-increase-brochure.pdf&tab0=0?hve=Read+the+brief.
  37. Carrie Tuttle, “When to Move from Air Cooling to Liquid Cooling for Your Data Center,” accessed December 16, 2024, https://www.dell.com/en-us/blog/when-to-move-from-air-cooling-to-liquid-cooling-for-your-data-center/.
  38. “AI and HPC – With Air or Liquid Cooling,” accessed December 17, 2024, https://www.delltechnologies.com/asset/en-us/products/servers/briefs-summaries/poweredge-xe9640-and-xe8640-infographic.pdf.
  39. “Fresh Air 2 Specification Sheet,” accessed December 17, 2024, https://www.dell.com/en-us/learn/assets/shared-content~data-sheets~en/documents~fresh-air-2-specification-sheet.pdf.
  40. “Investing in the rising data center economy,” accessed December 13, 2024, https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/investing-in-the-rising-data-center-economy.
  41. “IDC Report Reveals AI-Driven Growth in Datacenter Energy Consumption, Predicts Surge in Datacenter Facility Spending Amid Rising Electricity Costs,” accessed December 13, 2024, https://www.idc.com/getdoc.jsp?containerId=prUS52611224.
  42. “When to Move from Air Cooling to Liquid Cooling for Your Data Center,” accessed December 13, 2024, https://www.dell.com/en-us/blog/when-to-move-from-air-cooling-to-liquid-cooling-for-your-data-center/.
  43. “Energy Consumption in Data Centers: Air versus Liquid Cooling,” accessed December 13, 2024, https://www.boydcorp.com/blog/energy-consumption-in-data-centers-air-versus-liquid-cooling.html.
  44. “Our commitment to climate-conscious data center cooling,” accessed December 16, 2024, https://blog.google/outreach-initiatives/sustainability/our-commitment-to-climate-conscious-data-center-cooling/.
  45. Eric Olson, Anne Grau and Taylor Tipton, “Data centers are draining resources in water-stressed communities,” accessed December 20, 2024, https://www.dallasnews.com/opinion/commentary/2024/05/06/data-centers-are-draining-resources-in-water-stressed-communities/.
  46. Sharon Fisher, “Data Centers Leverage Cooling to Heat Homes,” accessed December 20, 2024, https://www.datacenterknowledge.com/cooling/data-centers-leverage-cooling-to-heat-homes.

This project was commissioned by Dell Technologies.

February 2025

Principled Technologies is a registered trademark of Principled Technologies, Inc.

All other product names are the trademarks of their respective owners.
