AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc chief executive officer Rene Haas.
By 2030, the world’s data centers are on course to use more electricity than India, the world’s most populous country, Haas said. Finding ways to head off that projected tripling of energy use is paramount if artificial intelligence is going to achieve its promise, he said.
“We are still incredibly in the early days in terms of the capabilities,” Haas said in an interview. For AI systems to get better, they will need more training – a stage that involves bombarding the software with data – and that’s going to run up against the limits of energy capacity, he said.
Haas joins a growing number of people raising alarms about the toll AI could take on the world’s infrastructure. But he also has an interest in the industry shifting more to Arm chip designs, which are gaining a bigger foothold in data centers. The company’s technology – already prevalent in smartphones – was developed to use energy more efficiently than traditional server chips.
Arm, which began trading on the Nasdaq last year after 2023’s largest US initial public offering, sees AI and data center computing as one of its biggest growth drivers. Amazon.com Inc’s AWS, Microsoft Corp and Alphabet Inc are using Arm’s technology as the basis of in-house chips that help run their server farms. As part of that shift, they’re decreasing reliance on off-the-shelf parts made by Intel Corp and Advanced Micro Devices Inc.
By using more custom-built chips, companies can lessen bottlenecks and save energy, according to Haas. Such a strategy could reduce data center power consumption by more than 15%, he said.
“There needs to be broad breakthroughs,” he said. “Any piece of efficiency matters.” – Bloomberg