Professional Services


Get Support for Your AI Journey

We are here to help

Today, artificial intelligence (AI) is driving a fundamental surge in data center capacity requirements.

Uptime Institute, the Global Digital Infrastructure Authority, has experts in over 40 countries and has certified and assessed thousands of data centers around the world.

Uptime continues to work with enterprise data center owners and cloud and colocation service providers to guide all aspects of AI deployments across all phases of the lifecycle, from business case development to data center design, construction, and facilities operations.

Uptime Institute Tier Standard: Topology and Tier Standard: Operational Sustainability have been applied to AI data centers with liquid cooling, supporting ultra-high-density GPU and CPU clusters for training and inference use cases. This ensures enterprises and colocation service providers can maintain their resiliency and availability standards while accommodating AI workloads.

Read more about Uptime’s unique viewpoints from the field.


Contact Us

Have questions or need help? Fill out the form and we will follow up with you right away.

View Our Report Series

AI embraces liquid cooling, but enterprise IT is slow to follow

The enthusiasm for generative AI is attracting serious investment, and the associated power and cooling requirements will pose a significant challenge for the data centers that house it. Upcoming AI training clusters will escalate silicon power and rack density to unprecedented heights, upending infrastructure design conventions and accelerating the adoption of cold plate and immersion cooling systems.

Neoclouds: a cost-effective AI infrastructure alternative

Over the past decade, three tech giants have solidified their dominance in the cloud computing market: Amazon Web Services (AWS), Microsoft Azure and Google Cloud. Estimates suggest that by the end of 2023, these companies collectively controlled around two-thirds of global cloud spending, a notable increase from 47% in 2016.

Generative AI and global power consumption: high, but not that high

Uptime Intelligence has been asked more questions about generative AI and its impact on the data center sector than any other topic. The questions come from enterprise and colocation operators, suppliers of a wide variety of equipment and services, regulators and the media.

These reports, along with many more resources, are available to our Uptime Network and Uptime Intelligence members:

The easiest way to access these reports, and hundreds more, is through Uptime Intelligence, the leading source of research, insight and data-driven analysis focused on digital infrastructure.

If you would like to access the full library of reports and resources, you can use the link below to request a free four-week evaluation of Uptime Intelligence.
Do you own and operate your own digital infrastructure? Gain access to these reports and all of our Intelligence research as part of Uptime Network - learn more here.

Existing Network member? You can access these reports directly via Uptime Network.

Frequently Asked Questions

1. What is the current impact of AI on the data center industry?

AI is driving a significant increase in data center capacity requirements. Most AI data center investment in 2024 has been driven by hyperscalers, either through self-builds or through colocation providers hosting hyperscale tenants. Enterprise AI data center builds exist but are not yet widespread. While the multi-billion-dollar, high-visibility new builds from the large cloud players and some nation-states (for sovereign AI) rightly draw attention, many organizations are considering how to retrofit current data centers or add AI-specific capacity to meet AI demand.
2. What is Uptime Institute's perspective on AI deployments in data centers?

Uptime Institute believes that most training of large language models will take place in the cloud (hyperscale), while small language models and select models with distinct requirements (e.g., sovereign AI) will be trained on premises. Inference will be more distributed and will not necessarily require AI factories.
3. How does AI impact data center infrastructure?

Training AI models requires large clusters of high-density compute, while inference can be more distributed and run on traditional IT infrastructure. High-density AI clusters require greater cooling capacity and significant changes to MEP systems and white space, power infrastructure and distribution, connectivity, building characteristics, and data center operations.
4. How can Uptime Institute help data center customers on their AI deployment journey?

Uptime Institute has the expertise to guide all aspects of data center design, construction, and operations, including for AI data centers. Uptime Institute is currently helping numerous customers worldwide with AI buildouts across all phases of the lifecycle, from business case development to design, construction, and facilities operations.
5. Can AI data center infrastructure meet Uptime Institute's Tier Standards?

Yes, it is possible to meet Tier Standards with liquid cooling. Existing Tier Certification of Design Documents, Constructed Facility, and Operational Sustainability services can be applied to AI data centers. Resiliency is eminently relevant for AI workloads – both training and inference – and Uptime Institute Tier Standard: Topology and Tier Standard: Operational Sustainability remain fully applicable. This is evidenced by primary research conducted by Uptime Intelligence as well as direct conversations with enterprise and colocation service providers who seek to maintain resiliency and availability standards while accommodating AI workloads. Contact us to discuss Tier Standards and your AI data center.
6. What cooling technologies do AI data centers use?

AI data centers can utilize various cooling technologies based on specific AI workload requirements. Perimeter cooling remains the favored method for traditional, low-density workloads. For high rack power (over 50 kW) or high-performance IT with specialized cooling needs, liquid cooling is typically employed.
7. How do you design for resilient and scalable power, cooling, and network architectures in AI data centers?

Ensuring your AI data center has resilient and scalable power, cooling, and network architecture is crucial. Uptime Institute can assist by conducting reviews to verify that your cooling, electrical systems, and infrastructure meet your resiliency goals.
8. Can existing data centers be retrofitted for AI workloads?

Yes, existing data centers can be retrofitted for AI workloads. Uptime Institute can help you determine whether your existing data center's infrastructure and operations can support AI workloads and identify any gaps that must be addressed before deployment.
9. What are the CapEx and OpEx implications of building AI data centers?

Building AI data centers involves significant capital and operational expenditure (CapEx and OpEx) for advanced hardware, cooling systems, and the infrastructure needed to maintain continuous and efficient operations. Uptime Institute can help ensure you are making the right infrastructure and network decisions for your business and market.

Uptime AI Findings in the News

Computer Weekly | Datacentre outages decreasing in frequency, Uptime Institute Intelligence data shows

Datacentre outages are becoming less common and severe, but power supply issues remain the enduring cause of most downtime incidents.

Read More at Computer Weekly

Computer Weekly | Space and power constrain datacentre planning

The government needs to tackle the resource issues that act as roadblocks to building out UK datacentre capacity.

Read More at Computer Weekly

Network World | Sustainability, grid demands, AI workloads will challenge data center growth in 2025

Uptime Institute predicts the data center industry in 2025 will face pressure over resource consumption, grid integration challenges, and AI infrastructure requirements.

Read More at Network World

Data Centre Dynamics | Understanding AI deployment methods and locations

No technology has dominated recent headlines more than AI, most notably large language models (LLMs) such as ChatGPT. One of the critical enablers of LLMs is powerful clusters of GPUs, which can train AI models to classify and predict new content with a reasonable degree of accuracy.

Read More at Data Centre Dynamics

Contact Us for More Information