New energy for the AI megatrend


Thermodynamic computing & co

Digitalisation & Technology, 01.08.2024

Artificial intelligence (AI) has quickly developed into a new technological cornerstone. AI is accelerating digitalisation, improving and expanding existing technologies and opening up almost unlimited possibilities for new developments. The catch: energy requirements are also almost unlimited. How can AI's hunger for energy be satisfied in the most climate-friendly way in times of global energy transition?


We have already reported on the high energy requirements of AI models. A lot has happened since then, but little has changed: while AI models are developing rapidly and constantly opening up new possibilities, their hunger for energy is still very high. And as new AI applications continue to come onto the market, the energy problem will continue to worsen.

Did you know that generating a single image with an AI tool consumes about as much energy as fully charging a smartphone battery?

The higher the computing power, the higher the energy requirement

According to the World Economic Forum, the computing power required to operate AI systems doubles approximately every 100 days. And more computing power means more energy. There are now various studies on the energy consumption of AI, but the estimates are difficult to make and vary accordingly. AI's current global electricity consumption, for example, could be roughly equivalent to the annual consumption of Switzerland.

The growth trajectory is even more decisive, however: experts consider a tenfold increase in energy requirements by 2030 to be realistic. Whether they have already taken into account the progressive integration of AI functions into digital systems such as smartphones, office applications, search engines and operating systems remains unclear. In any case, the experts urgently call for the development of sustainable solutions to counteract the considerable ecological impact.
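Taken together, these two figures can be made tangible with a quick back-of-the-envelope calculation. The following minimal Python sketch assumes (our assumption, for illustration) that the tenfold increase refers to the six years from 2024 to 2030:

```python
# Back-of-the-envelope check of the growth figures above.
# Assumption: the tenfold increase covers the six years from 2024 to 2030.

doubling_period_days = 100
days_per_year = 365

# Annual growth factor implied by a 100-day doubling time:
compute_growth = 2 ** (days_per_year / doubling_period_days)
print(f"Computing power: ~{compute_growth:.1f}x per year")  # ~12.6x

# Compound annual growth rate implied by a tenfold increase over six years:
energy_growth = 10 ** (1 / 6)
print(f"Energy demand: ~{energy_growth:.2f}x per year")  # ~1.47x, i.e. +47%
```

Even under these rough assumptions, energy demand growing by almost half every year is a trajectory that no grid expansion can simply absorb.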

There are several possible solutions, ranging from bioprocessors and underwater data centres to integrated eco-power plants and thermodynamic computing. They fall into two groups: measures that can take effect immediately, and long-term developments intended to pave the way to a sustainable AI future.

‘Sinking’ data centres into the sea

The number of data centres has been growing for years. In an analysis, the Borderstep Institute for Innovation and Sustainability puts the figure at around 80 million worldwide. Due to increasing digitalisation and the hype surrounding AI, that number is likely to rise further in the coming years. AI expert Ralf Herbrich from the Hasso Plattner Institute estimates that data centres' share of global energy consumption could skyrocket from four to five percent today to as much as 30 percent in the next few years.

In view of these figures, it quickly becomes clear that improving the energy efficiency of data centres would be a powerful lever. Data centres could save energy quickly and directly by throttling their computing performance, but this would significantly slow down the response times of AI applications. A completely different savings option is more realistic: data centres consume a great deal of energy for cooling.

Cooling is usually provided by air-circulation systems, which absorb the heat generated and transfer it to a cooling medium, usually water. There have therefore already been attempts to build data centres directly under water. A huge underwater data centre currently being planned in China is to have the computing power of six million conventional computers. It is due to go into operation in 2025 and, thanks to natural water cooling, is expected to save up to 122 million kilowatt-hours of electricity per year.
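How much natural cooling can save is usually discussed via the Power Usage Effectiveness (PUE) metric: the ratio of a facility's total consumption to the consumption of its IT equipment, where everything above 1.0 is overhead, most of it cooling. A minimal sketch with purely hypothetical numbers (the PUE values and IT load below are illustrative assumptions, not figures from the Chinese project):

```python
# Rough illustration of cooling savings via Power Usage Effectiveness (PUE).
# All numbers below are hypothetical assumptions for illustration only.

it_load_mw = 20          # assumed IT load of the facility
hours_per_year = 8760

pue_land = 1.4           # assumed conventional air-cooled data centre
pue_underwater = 1.1     # assumed passively water-cooled facility

def annual_energy_gwh(pue: float) -> float:
    """Total annual facility consumption in GWh for the assumed IT load."""
    return it_load_mw * pue * hours_per_year / 1000

savings = annual_energy_gwh(pue_land) - annual_energy_gwh(pue_underwater)
print(f"Annual saving: {savings:.0f} GWh")  # ~53 GWh under these assumptions
```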

Forecast: How successful the project will be will certainly also depend on the construction and maintenance costs, which are likely to differ considerably from those of conventional data centres on land. Underwater data centres are therefore more likely to remain the exception, suitable primarily for coastal conurbations where space is already at a premium.

Traditional computer systems are reaching their limits

The technological basis of today's computer systems is surprisingly old. The so-called ‘Von Neumann architecture’ goes back to the work of mathematician John von Neumann and was first published in 1945. In this computer architecture, the basic elements, processor and memory, work separately from each other and must therefore constantly exchange data. Such systems also process information digitally in the form of bits (0 and 1), perform calculations using logical operations and work deterministically.
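To make this constant data exchange tangible, here is a deliberately tiny toy model of a von Neumann machine in Python (a didactic sketch with an invented mini instruction set, not any real processor). Even a single addition requires a whole series of memory accesses:

```python
# Didactic toy model of a von Neumann machine: program and data share one
# memory, and the processor must fetch every instruction and every operand
# from that memory, one access at a time.

memory = [
    ("LOAD", 7),     # 0: copy memory[7] into the accumulator
    ("ADD", 8),      # 1: add memory[8] to the accumulator
    ("STORE", 9),    # 2: write the accumulator back to memory[9]
    ("HALT", None),  # 3: stop
    None, None, None,
    40,              # 7: first operand
    2,               # 8: second operand
    0,               # 9: result slot
]

accumulator = 0
pc = 0                # program counter
memory_accesses = 0

while True:
    op, addr = memory[pc]                  # instruction fetch
    memory_accesses += 1
    pc += 1
    if op == "HALT":
        break
    if op == "LOAD":
        accumulator = memory[addr]         # operand fetch
    elif op == "ADD":
        accumulator += memory[addr]        # operand fetch
    elif op == "STORE":
        memory[addr] = accumulator         # operand write
    memory_accesses += 1

print(memory[9], memory_accesses)  # 42, after 7 memory accesses for one addition
```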

The energy consumption of this classic architecture is extremely high, especially for AI applications that process large amounts of data.

Thermodynamic computing offers an alternative. Here, processor and memory work as a unit and exploit the principles of thermodynamics, i.e. the theory of heat and energy. Instead of purely logical operations, thermodynamic computing is based on physical processes and fluctuations at the molecular level. It works with probabilities rather than with deterministic states.
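What ‘computing with fluctuations’ means can at least be simulated in software. The following Python sketch uses the Metropolis rule, a classic method from statistical physics, to let random ‘thermal kicks’ drive a system into a low-energy state; a thermodynamic computer would let real physical noise do this sampling natively. The energy function and temperature below are illustrative assumptions:

```python
import math
import random

# Software simulation of the principle: instead of deterministic logic, the
# system takes a random walk whose "thermal" fluctuations settle it into
# low-energy states with high probability (a Boltzmann distribution).

def energy(x: float) -> float:
    """Toy energy landscape with its minimum at x = 2."""
    return (x - 2.0) ** 2

x = 10.0            # arbitrary starting state
temperature = 1.0   # strength of the fluctuations (illustrative value)

for _ in range(10_000):
    candidate = x + random.gauss(0.0, 0.5)        # random thermal kick
    delta = energy(candidate) - energy(x)
    # Metropolis rule: always accept downhill moves; accept uphill moves
    # with probability exp(-delta / T). The fluctuation does useful
    # computational work here instead of being mere waste heat.
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        x = candidate

print(f"Final state x = {x:.2f} (the energy minimum is at 2.0)")
```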

This computer architecture has several advantages in terms of energy efficiency:

While heat is a problem for classical computers, thermodynamic computers can utilise the energy of thermal fluctuations for calculations.

Thermodynamic systems are inherently capable of working in parallel, which can lead to higher performance for certain tasks. Today, the required parallelism is still provided by the widely used graphics processing units (GPUs), which were originally developed for processing images and videos.

Thermodynamic computers are better suited to dealing with uncertain information, i.e. information based on probabilities.

Forecast: Thermodynamic computing has the potential to drastically improve the energy efficiency of computer systems, especially for AI applications.

Neuromorphic computing: building computers like a brain

Another exciting approach is neuromorphic computing. In contrast to thermodynamic computing, the source of inspiration here is not physics, but the human brain and nervous system. The basic principle:

Neuromorphic computing attempts to replicate the structure and functioning of the human brain in hardware. The aim is to develop computers that work as efficiently and adaptably as biological neural networks. Here too, the aim is to merge computing processes and information storage in order to overcome the energy-intensive data transfer of conventional computer architectures.

Neuromorphic chips with artificial synapses are modelled on the human brain in two respects. On the one hand, the brain is a biological supercomputer of unimaginable capacity. On the other hand, it needs only around 20 watts, an almost unimaginable level of energy efficiency.
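A common building block of such chips is the spiking neuron, which stores its state exactly where it computes and only produces an event, a spike, when something actually happens. Here is a minimal leaky integrate-and-fire neuron in Python (the parameter values are illustrative, not taken from any real chip):

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a standard building block
# of many neuromorphic chips. The neuron keeps its state (the membrane
# potential) locally and only emits an event -- a spike -- when its
# threshold is reached, which is where the energy savings come from.

leak = 0.9           # how quickly the membrane potential decays per step
threshold = 1.0      # potential at which the neuron fires
potential = 0.0
inputs = [0.0, 0.3, 0.4, 0.5, 0.0, 0.0, 0.6, 0.7]  # incoming current per step

for t, current in enumerate(inputs):
    potential = potential * leak + current  # leak, then integrate the input
    if potential >= threshold:
        print(f"t={t}: spike!")             # the only 'expensive' event
        potential = 0.0                     # reset after firing
```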

Forecast: Neuromorphic computing is already being used in experimental chips and is being intensively developed further in a research project at the Jülich Research Centre.

Making data centres sustainable

The sustainable energy supply of data centres has so far been a secondary concern at best. As demand increases and the energy transition progresses, however, the requirements profile for the new data centres urgently needed for AI systems is changing.

A sustainable data centre is therefore currently being built in Mainz, very close to the important internet hub of Frankfurt am Main. The operator, Kraftwerke Mainz Wiesbaden (KMW), is not an internet company but an energy supplier. The data centre will be supplied with 100 percent green electricity from KMW's own wind turbines. In addition, its waste heat will not be released into the environment as usual, but fed into the local district heating network.

This one data centre alone will produce up to 60 MW of waste heat, which can be used to heat numerous households in the surrounding area. For comparison: the world's largest large-scale heat pump, currently being built in Esbjerg, Denmark, will also deliver 60 MW of heating capacity and supply up to 25,000 households.
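As a quick plausibility check of these figures (assuming, simplistically, that the waste heat can be used in full, which in practice depends on temperature levels and network losses):

```python
# Quick plausibility check of the district heating figures (rounded values).

waste_heat_mw = 60        # waste heat from the data centre
households = 25_000       # households supplied in the Esbjerg comparison

kw_per_household = waste_heat_mw * 1000 / households
print(f"~{kw_per_household:.1f} kW of heating capacity per household")  # ~2.4 kW
```

Around 2.4 kW per household is a realistic order of magnitude for district heating, so the comparison holds up.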

This is a great opportunity for the future: if all data centres fed their waste heat into district heating networks, the carbon footprint of AI systems would improve and the energy transition would receive a significant boost at the same time. A smart solution, isn't it?

Text: Falk Hedemann

Your opinion
If you would like to share your opinion on this topic with us, please send us a message to next@ergo.de.
