The last couple of years have seen a large leap in AI processing within datacentres. This growth is set to continue, but there is an unfortunate truth: without a step-change in energy efficiency, future AI applications will increasingly be limited by both the energy cost and the environmental impact of AI processing.
That environmental impact – on carbon, water, and energy – will be considerable, and governments, companies, and universities are under increasing pressure to limit it. Radical change is needed, and collaboration is a must.
The use of green energy and other solutions in running datacentres
On the positive side, some datacentres have already appointed dedicated sustainability officers, and some solutions are already being deployed. Evaporative cooling, for example, can be an attractive option compared with traditional air conditioning – it is typically more energy efficient and cheaper to operate and maintain – but it uses a vast amount of water, and for that reason it is already being phased out of some datacentres. Recycled water is one way around this, but it isn’t a perfect solution. So, is there one?
The datacentre industry is aware that much more needs to be done to curb the energy it consumes and the heat it produces. Renewable energy is, of course, a must-have for making datacentres carbon neutral, and IT companies are major purchasers of renewable energy, leading the way in the transition to green power.
In a sign of the datacentre industry’s ambition not only to reduce its environmental impact but also to set the bar for hitting Net Zero, the Climate Neutral Data Centre Pact now has over 100 datacentre signatories. Its aim, as part of its ‘climate neutral’ pledge, is for renewable energy to become the sole power source for datacentres by 2030.
Solutions are currently being devised to position datacentres closer to renewable sources of energy. But until these sources are more accessible and widespread, they cannot be relied upon alone – and even where they are used, datacentres should still aim to be as energy efficient as possible. Indeed, there is a view that until more of our energy comes from renewable sources, datacentres running on green energy simply prevent other users from accessing it.
What other solutions are helping to improve datacentre sustainability?
Some datacentres are located in countries with naturally cool climates, where outside air can be used for cooling – a process known as free cooling. Innovative approaches are also attempting to capture the vast amount of heat produced by datacentres and reuse it in nearby heating systems and buildings.
All of these have the potential to be viable solutions – but all are inherently inefficient. The rapid growth in the use and complexity of AI, and the demand for capacity that comes with it, requires a technical performance solution too.
The role of technology in this mission
There are some solutions providing real-time performance improvements to existing infrastructure. Asset performance management software, for example, gives an overview of the energy consumption of systems, allowing technicians to monitor devices and optimise efficiency by adjusting heating and cooling levels.
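As an illustration only – not a description of any particular product – the sketch below shows, in Python with entirely hypothetical device names and readings, the kind of logic such monitoring software applies: read the power draw and inlet temperature of each device, then nudge a cooling setpoint up or down accordingly.

```python
from dataclasses import dataclass

@dataclass
class DeviceReading:
    """One snapshot of a monitored device (hypothetical fields)."""
    name: str
    power_watts: float   # instantaneous power draw
    inlet_temp_c: float  # air temperature at the device inlet

def recommend_setpoint(readings, current_setpoint_c, max_inlet_c=27.0):
    """Lower the cooling setpoint if any device nears its inlet limit;
    raise it slightly (saving cooling energy) if everything runs cool."""
    hottest = max(r.inlet_temp_c for r in readings)
    if hottest > max_inlet_c:
        return current_setpoint_c - 1.0   # more cooling needed
    if hottest < max_inlet_c - 4.0:
        return current_setpoint_c + 0.5   # safe to ease off
    return current_setpoint_c

# Simulated readings, purely illustrative
readings = [
    DeviceReading("gpu-node-01", power_watts=5200, inlet_temp_c=24.1),
    DeviceReading("gpu-node-02", power_watts=4800, inlet_temp_c=22.8),
]
it_load_kw = sum(r.power_watts for r in readings) / 1000
setpoint = recommend_setpoint(readings, current_setpoint_c=21.0)
print(f"IT load: {it_load_kw:.1f} kW, recommended setpoint: {setpoint:.1f} °C")
```

In practice this logic sits behind dashboards and alerting rather than a simple loop, but the principle is the same: visibility into energy use is what makes the efficiency adjustments possible.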
However, with traditional electronics, the energy consumed and the heat produced by silicon-based GPU AI accelerators mean there are limits to how much AI computing performance a datacentre can deliver. Meeting AI processing requirements demands an enormous increase in power consumption, which flies in the face of the environmental targets of the companies that rely on datacentres.
The industry needs a paradigm shift – and 3D optical AI computing offers this necessary transition.
3D optical computing computes with photons instead of electrons, so it consumes far less energy than GPU solutions and avoids the inherent limitations of traditional electronics, such as being constrained to the 2D geometry of silicon chips. The achievable performance-per-watt means datacentres can unlock compute speeds many times faster than traditional means – up to 1000x – and at greater scale, all while consuming a fraction of the energy. With the corresponding reduction in heat generated, optical compute is a sustainable solution for next-generation AI.
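To make the performance-per-watt framing concrete, the short calculation below uses purely hypothetical figures – they are not specifications of any real accelerator, electronic or optical – to show how energy per unit of compute, and therefore the energy needed for a fixed AI workload, scales with throughput at a given power draw.

```python
def joules_per_teraop(power_watts: float, throughput_tops: float) -> float:
    """Energy per tera-operation: W / (TOP/s) = J/TOP."""
    return power_watts / throughput_tops

# Hypothetical accelerators, for illustration only
electronic = joules_per_teraop(power_watts=700, throughput_tops=1_000)    # assumed GPU-class figures
optical    = joules_per_teraop(power_watts=700, throughput_tops=100_000)  # assumes 100x throughput at equal power

workload_teraops = 1e9  # size of a fixed AI workload, in tera-operations (illustrative)
for name, j_per_top in [("electronic", electronic), ("optical", optical)]:
    kwh = workload_teraops * j_per_top / 3.6e6  # joules -> kWh
    print(f"{name:10s}: {j_per_top:.4f} J/TOP -> {kwh:,.0f} kWh for the workload")
```

Under these assumed numbers the same workload needs roughly a hundredth of the energy; the up-to-1000x speed-up described above would shift the ratio further still.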
What 2024 (and beyond) looks like for sustainable datacentres
While governments harbour ambitions to become carbon neutral by 2050, the datacentre industry is on a much faster track, aiming to reach this goal by 2030. This means that bigger changes will take hold sooner, and 2024 will mark the start of further acceleration.
The rapid development of generative AI over the last year shows just how quickly the technology can move, and the growing compute demand that comes with it. There is a widely held view that providing this level of high performance is a significant environmental challenge, and great efforts are being put into solving it. Datacentre providers and operators, for example, are placing this priority at the heart of their purchasing criteria.
The use of renewable energy will increase further, as will ways of reusing and recirculating heat and energy. But without a fundamental change in the way AI processing is powered in datacentres, it will be hard to meet the goals set out in the Pact or to deliver a step-change in energy efficiency. This is why a paradigm shift to other ways of delivering AI processing, such as 3D optical computing, is so crucial. It all needs to be driven by broader industry collaboration and government investment.
For sustainable datacentres to become a reality, all of these changes, innovations and technological advancements need to happen together.