Navigating Liquid Cooling Architectures for Data Centers with AI Workloads

AI training and inference servers rely on accelerators and processors with high thermal design power (TDP). Air cooling these chips becomes less practical once heat sink dimensions, server airflow requirements, and energy efficiency are taken into account, forcing a transition to liquid cooling. Liquid-cooled servers offer benefits including improved accelerator reliability and performance, higher energy efficiency, reduced water usage, and lower sound levels.
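To illustrate why air cooling struggles at high TDP, the short sketch below compares the coolant flow required to carry away a fixed heat load with air versus water, using the standard relation q = ṁ·c_p·ΔT. The 1000 W heat load and 10 K coolant temperature rise are illustrative assumptions, not figures from this paper; the fluid properties are typical room-temperature values.

```python
# Back-of-the-envelope comparison of the coolant flow needed to remove a
# fixed heat load with air vs. water, using q = m_dot * c_p * delta_T.
# The TDP and temperature rise below are assumed values for illustration.

TDP_W = 1000.0      # assumed accelerator heat load [W]
DELTA_T_K = 10.0    # assumed allowable coolant temperature rise [K]

COOLANTS = {
    # name: (specific heat [J/(kg*K)], density [kg/m^3]) near room temperature
    "air":   (1005.0, 1.16),
    "water": (4182.0, 997.0),
}

for name, (cp, rho) in COOLANTS.items():
    m_dot = TDP_W / (cp * DELTA_T_K)   # required mass flow [kg/s]
    v_dot = m_dot / rho                # required volumetric flow [m^3/s]
    if name == "air":
        # Convert m^3/s to cubic feet per minute (1 m^3/s ~= 2118.88 CFM)
        print(f"{name:>5}: {v_dot * 2118.88:6.1f} CFM")
    else:
        # Convert m^3/s to liters per minute (1 m^3/s = 60000 L/min)
        print(f"{name:>5}: {v_dot * 60000:6.2f} L/min")
```

Under these assumptions the air case works out to roughly 175 CFM per chip while the water case needs only about 1.4 L/min, which is the basic reason liquid cooling scales more gracefully as TDP climbs.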