
Building the essential server capacity for AI

Nessim-Sariel Gaon of LIAN Group predicts that the number of AI servers will have to increase tenfold to maintain the market’s growth

AI is at the height of its influence. From ChatGPT’s breakthrough in 2022 to NVIDIA becoming one of the world’s most valuable companies, the sector’s rise has been nothing short of meteoric. A behind-the-scenes example is the infrastructure company Vertiv, which, like others, has excelled in recent years, delivering strong performance and rewarding its investors with high returns.

While some see uncertainty about AI’s future, I wholeheartedly believe in the ‘AI revolution’. I’m optimistic about the market and confident in its resilience and its potential to go from strength to strength.

AI is the driving force behind some of today’s major innovations. But as models get bigger and competition across the sector gets fiercer, the need for greater capacity has already become critical, requiring more bespoke digital infrastructure to sustain growth. Data centres will be at the core of AI’s next growth phase.

Currently, many servers worldwide are not built to support AI compute. Looking ahead, the next generation of AI models is projected to require clusters on the scale of 300,000 GPUs. Merely adding 20,000 GPUs to a cluster and squeezing it into an existing data centre, as some operators are doing, simply won’t work. Retrofitting or co-locating current infrastructure won’t be sufficient to meet these growing needs.

Ultimately, to support this level of scale, we’ll need to construct new, purpose-built data centres.

Big Tech has caught on. During the most recent earnings season, Microsoft, Meta, Google and Amazon all announced that they’re substantially raising their spending in the sector, much of which will flow into the data centres that can fuel AI’s growth. This is clear in their annualised capital expenditure, which has surged by a staggering $91 billion year on year – a surefire sign they’re betting on data centres to drive AI’s expansion.

And they’re right to do so. If we stopped investing and simply maintained current capacity, the rapid growth of the sector and the constant stream of new startups would drive vacancy rates in AI data centres close to 0%. Fledgling, innovative startups would struggle to enter or disrupt the sector – and we would be placing a real ceiling on AI’s growth. In simple terms, we could be depriving the world of the next OpenAI.

It is now essential for the public to grasp that capacity is the backbone of the entire tech industry. The myriad servers, wires and cooling systems housed in warehouses worldwide are the unsung heroes behind the scenes, supporting the AI innovation that drives the global economy day after day. Building more data centres will therefore be vital to AI’s ongoing expansion and a decisive factor in its progress over the coming years.

But despite these encouraging signals from the world’s tech giants, I still think more is needed.

As I mentioned, models are only getting larger – to put that in perspective, large-scale AI models require on average about 100x more compute than typical ones. On an infrastructure level, the stats are equally revealing: of the 10,000–30,000 data centres worldwide, only around 325–1,400 could host an AI supercomputer. The fact is, the capacity and compute currently available globally just don’t match up to model growth. We must close that gap.

So, the answer is clear: to maintain AI’s current expansion path – both in the market and the models themselves – we need to build, build and build. Fast.

As with any boom, some investors have grown sceptical about how long AI can keep its winning streak going. With words like ‘bubble’ flying about, analysts have become increasingly polarised on the market – all despite the huge returns it has delivered.

Although I don’t share their concerns, these investors have to be brought back on board – and to do that, we need to ensure AI isn’t hampered by any limitation. We can’t afford to place barriers on the market, to lose investor confidence, or to let speculation take hold.

That’s why, for me, we have to increase both the number of AI servers and cluster sizes tenfold. Bolstering the stability and scale of the sector’s digital infrastructure is the way to sustain AI’s growth and maintain confidence in the market. From the servers that power AI models to the data centres that house them, capacity will be a major guarantor of the market’s continued success.

We should aim to build these data centres and increase cluster sizes within the next two years – by the end of 2026 at the latest. Steel, servers, machines and GPUs will all define the next era of AI, so it’s time to plan ahead and gear up to maintain our upward trajectory. Data centres will be the backbone of the sector’s future.

Nessim-Sariel Gaon is the co-founder and managing partner of investment firm LIAN Group

Main image courtesy of iStockPhoto.com and quantic69
