What Does the End of Moore’s Law Mean for the Data Center Industry?

In case you missed it, Moore’s Law – which states that computing power steadily increases over time – is dead, or at best, slowly dying out. Computer chips are no longer gaining computing power as quickly as they have in decades past.

What does this change mean for data centers? Possibly quite a lot. Read on to learn how slowing computing power growth could impact the data center industry.

What is Moore’s Law and why is it dead?

Moore’s Law, named after Intel co-founder Gordon Moore, who proposed the concept in 1965, is the principle that the number of transistors that engineers can put into computer chips doubles about every two years. More broadly, the processing power of the average chip should increase at the same rate, and the cost that companies pay for the processing power should decrease.
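As a rough illustration of what that doubling rate implies, here is a short sketch; the starting transistor count is purely hypothetical, and the only rule applied is one doubling every two years.

```python
# Illustrative only: project transistor count under Moore's Law,
# assuming one doubling every two years (starting count is hypothetical).

def projected_transistors(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Return the projected transistor count after `years`."""
    return start_count * 2 ** (years / doubling_period)

# A hypothetical chip with 1 billion transistors today would be projected
# to reach roughly 32 billion after a decade of Moore's Law scaling.
print(f"{projected_transistors(1e9, 10):.2e}")  # ~3.20e+10
```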

For decades, Moore’s prediction proved largely accurate: computing capacity increased at roughly the rate he forecast.

But that’s no longer true. While it may be too early to say that Moore’s Law is definitely dead, there’s reason to believe we’ve reached the physical limits of silicon-based CPUs. Without a practical alternative, engineers won’t be able to increase the computing power of chips as quickly and cheaply as they have in years past.

It’s entirely possible that bright minds will find ways around silicon’s current limitations — or that quantum computing will finally become practical and completely change the game around computing power. For now, however, data shows that the rate of increase in computing power is slowing, and there are no clear signs that this trend will change any time soon.

Moore’s Law and Data Centers

The fact that CPU capacity isn’t growing as fast could have several profound implications for data centers.

More data centers

Perhaps the most obvious is that more data centers are likely to be built.

That would probably happen even if Moore’s Law still held. Demand for digital services has long outstripped the growth in computing power, which means companies have had to expand the footprint of their IT infrastructures even as the computing power of each server has increased.

But in a post-Moore’s Law world, we need even more data centers. If servers stop getting more powerful every year, the only way to meet rising user demand is to deploy more servers, which means building more data centers.
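As a back-of-the-envelope sketch of that dynamic (the growth rate and fleet size below are assumptions, not data from this article):

```python
# Hypothetical illustration: with flat per-server performance, the server
# fleet has to grow at the same rate as demand (all numbers are assumptions).

demand_growth_per_year = 0.25   # assume demand grows 25% per year
servers = 1_000                 # assumed starting fleet size

for year in range(1, 6):
    servers *= 1 + demand_growth_per_year
    print(f"Year {year}: ~{servers:,.0f} servers needed")

# Under these assumptions, the fleet roughly triples within five years.
```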

Data center sustainability challenges

Increasing the total number of data centers will exacerbate existing challenges related to data center sustainability. More servers result in higher power consumption, especially if the number of transistors per chip stays the same.
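A simple, purely hypothetical estimate shows how directly facility power scales with server count when per-server efficiency stops improving:

```python
# Purely hypothetical estimate of how facility power scales with server count
# when per-server draw stays constant (every figure here is an assumption).

servers = 10_000            # assumed fleet size
watts_per_server = 400      # assumed average draw per server
pue = 1.4                   # assumed power usage effectiveness of the facility

total_kw = servers * watts_per_server * pue / 1_000
annual_mwh = total_kw * 24 * 365 / 1_000
print(f"~{total_kw:,.0f} kW facility load, ~{annual_mwh:,.0f} MWh per year")
```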

This will likely make data center providers that can offer clean energy procurement even more attractive. The same goes for next-generation data center technologies, such as immersion cooling, that can reduce the carbon footprint of data center facilities.

More and more companies are entering the chip market

For decades, a relatively small number of vendors – namely Intel and AMD – dominated the market for computer chips used in commodity servers. These companies were able to deliver steadily increasing computing power, which offered little incentive for other companies to get into chip production.

But that’s changed in recent years as companies like AWS have started building their own chips, and the end of Moore’s Law should push such companies to invest even more in CPU technology. The reason is that they will be looking for new and better ways to squeeze more performance and efficiency out of chips, especially for the specific use cases they run on those CPUs.

In other words, in a world where generic CPUs are no longer getting more powerful and cheaper by the year, companies have more incentive to design their own specialty CPUs, optimized for the use cases that matter most to them.

Workload optimization is becoming increasingly important

Reducing the CPU consumption of workloads has always been a smart move for companies looking to save money on hosting costs. But in a post-Moore’s Law world, workload optimization becomes even more important.

That means we’re likely to move more workloads into containers, for example. The FinOps and cloud cost optimization market is also expected to boom as more organizations seek strategies to maximize the efficiency of their workloads.
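As a toy example of the kind of rightsizing arithmetic FinOps teams run, consider the sketch below; every figure in it is a hypothetical assumption, not real pricing or utilization data.

```python
# Toy rightsizing calculation of the kind FinOps teams run; every figure
# below is a hypothetical assumption, not real pricing or utilization data.

provisioned_vcpus = 64      # assumed current allocation for a workload
avg_utilization = 0.30      # assumed observed average CPU utilization
target_headroom = 0.40      # keep 40% headroom above the observed average
cost_per_vcpu_month = 25.0  # assumed blended cost in dollars per vCPU-month

needed_vcpus = provisioned_vcpus * avg_utilization * (1 + target_headroom)
monthly_savings = (provisioned_vcpus - needed_vcpus) * cost_per_vcpu_month
print(f"Rightsize to ~{needed_vcpus:.0f} vCPUs, saving ~${monthly_savings:,.0f}/month")
```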

Conclusion

The data center industry grew up in a world where the performance of computer chips was constantly increasing and the cost was decreasing. But that world is gone. We now live in, or at least near, the post-Moore’s Law age.

The result is likely to be more data centers, more specialty CPUs, and greater pressure on companies to optimize their workloads and data centers. Data center providers and their customers need to adapt – or keep their fingers crossed that the quantum revolution finally arrives and makes computing power ridiculously cheap, although that’s unlikely to be a winning strategy.
