There has been much debate about what AI might evolve into in the future; an AI summit in the UK in November 2023 drew technology leaders from around the world to discuss possible outcomes, good and bad. Fittingly, the meeting was held at Bletchley Park, home to the Second World War code breakers who used the early computer ‘Colossus’, perhaps a precursor to today's AI with its 2,500 tubes, 17-square-meter footprint, 5-metric-ton weight, and 8 kW power consumption. The ‘Turing test’, which asks whether a computer can ‘think’, is named after the Bletchley Park codebreaker Alan Turing, and it is argued that the Large Language Model ChatGPT now passes it with ease.
AI limits — energy and hardware
So, what are the controlling influences on AI? Ethical considerations apart, there are practical limits in play, not least the hardware involved and the power needed to run it. Before AI, data centers saw huge increases in throughput to meet demand for services such as video streaming and cloud gaming, but the hardware kept pace, and developments in programming and in power delivery system efficiency meant that overall data center energy draw barely increased: consumption rose by just 20% between 2015 and 2022 according to the International Energy Agency (IEA), despite a 340% increase in data center workload over the same period.
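Taking the article's IEA figures at face value, a quick calculation shows how large the implied efficiency gain was (reading a "340% increase" as a ×4.4 multiple of the 2015 workload):

```python
# Implied change in energy per unit of workload, 2015 -> 2022,
# using the figures quoted above (assumed exact for illustration).
energy_factor = 1.20        # +20% total data center energy use
workload_factor = 1 + 3.40  # +340% workload, i.e. x4.4

# Energy intensity: energy per unit of workload, relative to 2015.
intensity = energy_factor / workload_factor
print(f"Energy per unit workload in 2022: {intensity:.0%} of the 2015 level")
# -> about 27%, i.e. an efficiency improvement of roughly 73% in seven years
```

In other words, each unit of work done in 2022 needed barely a quarter of the energy it did in 2015.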
Data centers, AI, and energy — the numbers
Things are changing with AI, however. The training phase of a Large Language Model (LLM) is energy intensive, with ChatGPT reportedly consuming over 1.2 GWh. In the operational or ‘inference’ phase, the figure is around 564 MWh per day, using nearly 30,000 NVIDIA GPUs in over 3,600 servers. As a result of figures like these, together with all the other server applications, the IEA puts the growth of data center energy use at 20–40% annually, and Schneider Electric estimates that the AI element will grow at a CAGR of 25–30% through 2028.
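A rough sanity check on the inference figures quoted above: spreading 564 MWh per day across roughly 30,000 GPUs (the article's round figure, not an exact inventory) gives the average electrical draw per device, including server overhead:

```python
# Average power implied by the quoted inference energy figures.
daily_energy_mwh = 564   # MWh consumed per day (from the article)
gpu_count = 30_000       # approximate GPU count (article's round figure)

avg_power_mw = daily_energy_mwh / 24            # MWh/day -> MW of average draw
watts_per_gpu = avg_power_mw * 1e6 / gpu_count  # MW -> W, shared per GPU

print(f"Fleet average power: {avg_power_mw:.1f} MW")
print(f"Average per GPU (incl. server overhead): {watts_per_gpu:.0f} W")
# -> about 23.5 MW for the fleet, or roughly 780 W per GPU
```

Several hundred watts of continuous draw per GPU is consistent with high-end data center accelerators plus their share of CPU, memory, and cooling load, which is why the power delivery system matters so much at this scale.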
Learn more about AI power considerations from Flex Power Modules