The global experiment in artificial intelligence is just beginning. But the spending frenzy by big tech companies to build and lease data centres, the engine rooms of AI, is well underway. They poured an estimated US$105bil (RM458.95bil) last year into these vast, power-hungry facilities.
That spending spree is increasing demand for electricity and raising environmental concerns. A recent headline in The New Yorker called the energy demands of AI “obscene”. But there’s another perspective on AI and the environment, focusing not on how the technology is made but on what it can do.
AI has the potential to help accelerate scientific discovery and innovation in one field after another, lifting efficiency and reducing planet-warming carbon emissions in sectors like transportation, agriculture and energy production.
Here’s what to know.
What makes those data centres so power-hungry?
It’s the rise of so-called generative artificial intelligence.
Generative AI can do a lot – not only analyse data and make predictions, but also write poetry and computer code, summarise books and answer questions, often with human-level proficiency. And that kind of computing needs a lot of energy. A query to ChatGPT requires nearly 10 times as much electricity as a regular Google search, according to a recent estimate.
Researchers had been working on generative AI for years, but it really burst onto the scene in November 2022 when OpenAI introduced ChatGPT, the conversational chatbot that became a sensation. Microsoft has invested more than US$13bil (RM56.32bil) in OpenAI and is racing to include AI features in its products. So are Amazon, Google and Meta, the owner of Facebook, Instagram and WhatsApp.
How much will electricity demand increase?
Some estimates run higher, but experts generally forecast that energy consumption by data centres worldwide will at least double over the next few years. Goldman Sachs has estimated that electricity use by data centres will increase by 160% by 2030. A recent forecast by the International Energy Agency projected that demand would more than double by 2026.
All of these projections point to sizable increases, suggesting sharply higher greenhouse gas emissions from data centres if they draw their power from fossil fuels like coal and natural gas. But keep in mind: The global electricity sector is huge and varied. Data centres account for about 1% to 2% of total electricity demand. That share, according to estimates, will increase to 3% to 4% by 2030.
What’s the case for AI as a green technology?
Artificial intelligence, experts say, is a general-purpose tool that, if used wisely across the economy, could reduce greenhouse gas emissions by 5% to 10% by 2030, according to a study by the Boston Consulting Group commissioned by Google.
For example, the technology promises to “give biological design a boost”, said Drew Endy, an associate professor of bioengineering at Stanford University. The result, he said, might well be to turbocharge biology by discovering the right DNA formulas to unlock more efficient, less-polluting agriculture.
AI could also radically transform the way we find metals that are critical not only to the tech industry but to the fight against climate change. In one case, it helped find a vast deposit of copper, a key component in electric vehicles, in Zambia.
And Zanskar, a startup in Salt Lake City, is using AI to try to improve the success rate of discovering geothermal energy for power plants. About 90% of geothermal projects started from scratch fail mainly because they drill in the wrong places, said Carl Hoiland, a co-founder and the CEO of Zanskar. But AI, combined with new geologic data sets like satellite and seismic sensor data, could open the door to doubling or tripling the field’s meager success rate.
In theory, that could make a big difference in the fight against global warming. Geothermal is a clean, round-the-clock energy source, but it currently accounts for less than half of 1% of the electric power in the United States.
The takeaway
Even though electricity demand from AI is expected to at least double in the coming years, the efficiency of the technology could increase at an even higher rate. There is a historical precedent.
Consider what happened with cloud computing. Energy consumption surged in the early 2000s, and there were concerns that the increase would continue. But while the computing output of the world’s data centres jumped sixfold from 2010 to 2018, energy consumption rose only 6%. In other words, the energy used per unit of computing fell by more than 80%.
A similar trend, industry analysts say, may well emerge with AI.
“After the mania has calmed down, other incentives kick in,” said Jonathan Koomey, a former scientist at the Lawrence Berkeley National Laboratory who is now an independent researcher. “There is a huge incentive for the industry to become more efficient.”
The big tech companies are working on ways to streamline their software, hardware and cooling systems to reduce electricity consumption in their data centres. They are locating computing facilities in northern countries, pulling in cold outdoor air as a coolant to reduce electricity and water use. And they are investing in alternative energy sources.
If those efforts are successful, and if we’re smart about how we use AI, it might eventually offer a lot of environmental bang for the buck. – The New York Times