AI’s energy problem – when technological innovations use as much electricity as an entire country

Artificial intelligence is transforming technology, with many predictions suggesting it will prove as transformative over the coming years as the shift from steam power to electricity once was. But electricity is precisely the aspect that mustn't be overlooked here – or rather, AI's energy consumption. If a genuine digital transformation is to be built on innovations such as artificial intelligence, it must happen sustainably, and that remains a real challenge, at least for now.

Training artificial intelligence software can consume as much energy as 40 homes use in a year

Generative artificial intelligence – whether it's producing images, text or anything else – doesn't conjure what users request out of thin air. To understand what it needs to generate, AI relies on prior "learning", and that training requires millions upon millions of pieces of data: examples, images, words, expressions and more.

As you can imagine, this training process consumes a vast amount of energy. For example, the AI company Hugging Face reports that training its multilingual text generation model alone consumed 433 MWh (megawatt hours) – enough to power 40 average American homes for an entire year. Google, meanwhile, processes around 9 billion search queries per day. If an artificial intelligence assistant handled every one of those queries – as the industry would like – more than 29 TWh (terawatt hours) would be consumed over the course of a year. That's roughly the amount of electricity a country the size of Ireland generates and consumes annually, to give you an idea of the scale involved.
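The figures above hold up to a quick back-of-envelope check. A minimal sketch in Python, assuming an average American household uses roughly 10.8 MWh of electricity per year (an assumed figure, not stated in the article):

```python
# Sanity-check the article's energy figures.

TRAINING_MWH = 433           # reported training consumption for the model
HOME_MWH_PER_YEAR = 10.8     # assumed average US household use per year

homes_powered = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"~{homes_powered:.0f} homes powered for a year")   # ~40

QUERIES_PER_DAY = 9e9        # Google searches per day (article's figure)
ANNUAL_TWH = 29              # projected yearly AI-assistant consumption

# 1 TWh = 1e12 Wh; spread the annual total across every query in a year.
wh_per_query = ANNUAL_TWH * 1e12 / (QUERIES_PER_DAY * 365)
print(f"~{wh_per_query:.1f} Wh per AI-assisted query")    # ~8.8
```

In other words, the 29 TWh projection implies each AI-assisted search would draw roughly 9 Wh – many times the energy of a conventional search – which is what makes the aggregate figure so large.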

 

As remarkable as these figures are, they pale next to estimates of how much energy artificial intelligence will require in the near future if its use becomes widespread among the general public – which is looking very likely. The cost of setting up AI-capable servers is already massive, and by 2027 global AI-related electricity consumption could grow to between 85 and 134 TWh per year, according to some projections.

Finding an energy source for artificial intelligence

With these figures in mind, urgent solutions are needed to generate the energy required. Sergey Edunov, Director of Engineering for Generative AI at Meta (formerly Facebook), recently said that "only" two nuclear reactors would be needed to meet the energy demand of his company's developments. Other giants in the industry, including Amazon, Google and Apple, face similar energy needs, and these big names are already investing in nuclear energy to power their AI innovations.

But what about the carbon footprint? In a perfect world, all the energy required to run AI would come from sources that don't emit greenhouse gases – something that simply isn't the case today. Graphics cards, for example, which run at full capacity when generating images, still aren't being "fed" by clean energy sources. Data centres already account for around 1% of the world's energy-related emissions, a figure that will only grow as artificial intelligence innovations are rolled out.

But a solution could be found in just two steps. Firstly, boost the use of renewable energy in data centres – European data-centre operators, for example, have committed to becoming climate neutral by 2030. Secondly, expand and improve the regulation of artificial intelligence, so that companies are obliged to publicly disclose how much energy their developments consume and the steps they're taking to reduce that figure.


