Generative AI could soon use more power than a country


AI requires the development of many dedicated servers, which consume a lot of energy. — AFP Relaxnews

A Dutch researcher has highlighted the enormous energy use associated with the new generation of tools powered by generative artificial intelligence. If adopted widely, these tools could eventually end up using as much energy as an entire country, or even several countries combined.

Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has published research in the journal Joule into the environmental impact of emerging technologies such as generative artificial intelligence.

The arrival, in less than a year, of tools such as ChatGPT (from OpenAI), Bing Chat (Microsoft) and Bard (Google), as well as Midjourney and others in the image sector, has greatly boosted demand for servers and, consequently, for the energy required to keep them running. This development inevitably raises concerns about the environmental impact of a technology that is already being used by many people.

In recent years, excluding cryptocurrency mining, electricity consumption by data centers has been relatively stable, at around 1% of global consumption. However, the expansion of AI, which is unavoidable in many fields, is likely to change the game.

According to Alex de Vries, the GPT-3 language model alone consumed more than 1,287 MWh during training. Once this training phase is complete, the tool is put to work at inference, which in ChatGPT's case means generating responses to users' prompts.

At the start of the year, SemiAnalysis estimated that OpenAI needed 3,617 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, corresponding to an energy demand of some 564 MWh per day.

And that, of course, is just the beginning. According to the same SemiAnalysis estimates, implementing an AI similar to ChatGPT in every Google search would require 512,821 dedicated servers, or a total of over 4 million GPUs.

With a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh (terawatt-hours, each equal to one billion kilowatt-hours). In this most pessimistic scenario, AI deployed at scale by Google could consume as much electricity as a country like Ireland (29.3 TWh per year).
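These figures follow from straightforward arithmetic. Below is a minimal sketch using only the server counts and the 6.5 kW per-server figure quoted above, and assuming the servers run around the clock:

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: servers run continuously at the stated power draw.

POWER_PER_SERVER_KW = 6.5   # per-server power demand cited in the article
HOURS_PER_DAY = 24
DAYS_PER_YEAR = 365

def daily_energy_gwh(servers: int, kw_per_server: float = POWER_PER_SERVER_KW) -> float:
    """Daily electricity use in GWh for a fleet of always-on servers."""
    kwh_per_day = servers * kw_per_server * HOURS_PER_DAY
    return kwh_per_day / 1e6  # kWh -> GWh

# ChatGPT today (SemiAnalysis estimate): 3,617 servers
print(daily_energy_gwh(3_617) * 1e3)       # ~564 MWh per day

# AI in every Google search: 512,821 servers
google_daily = daily_energy_gwh(512_821)   # ~80 GWh per day
print(google_daily)
print(google_daily * DAYS_PER_YEAR / 1e3)  # ~29.2 TWh per year, comparable to Ireland
```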

Alphabet has already confirmed that an interaction with a language model could consume up to 10 times more electricity than a standard keyword search, rising from around 0.3 Wh to around 3 Wh per query. Nvidia, the leading supplier of AI servers, could meanwhile be selling more than 1.5 million units by 2027, for a total consumption of between 85 and 134 TWh per year.
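That range is consistent with 1.5 million servers running around the clock at somewhere between roughly 6.5 kW and 10 kW each. The per-server figures in the sketch below are illustrative assumptions rather than numbers from the article:

```python
# Rough consistency check of the 85-134 TWh range for AI servers.
# Assumption (illustrative): each server draws 6.5-10.2 kW and runs 24/7 all year.

SERVERS = 1_500_000
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(kw_per_server: float) -> float:
    """Annual electricity use in TWh for 1.5 million always-on servers."""
    return SERVERS * kw_per_server * HOURS_PER_YEAR / 1e9  # kWh -> TWh

print(annual_twh(6.5))    # ~85 TWh per year
print(annual_twh(10.2))   # ~134 TWh per year
```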

In conclusion, AI-related electricity consumption is fast becoming a major concern. Nevertheless, there are a number of ways in which it could be reduced. The first would obviously be to prioritise renewable energy sources to power data centers.

Next comes the need to find ways of developing algorithms that consume less energy. Finally, Internet users could be educated to use AI responsibly and without excess. – AFP Relaxnews
