The use of artificial intelligence, particularly generative models like ChatGPT, is sparking a debate about its environmental impact. While some believe that AI consumes an alarming amount of energy, others suggest that the ecological footprint might be more moderate than expected. This article explores the energy implications of artificial intelligence and the often misleading comparisons that circulate.
Three key takeaways
- The estimated energy consumption of an average query to ChatGPT is 0.34 Wh, contrary to exaggerated claims.
- AI image generation consumes significantly more energy than text generation.
- The industrial-scale environmental impact remains concerning despite the low individual footprint.
The actual energy consumption of ChatGPT
ChatGPT's energy consumption has often been compared, misleadingly, to a light bulb left on for several hours or even an entire day. In reality, an average query consumes about 0.34 Wh, the equivalent of a few minutes of lighting from an efficient LED bulb. These figures show that the individual impact is minimal.
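The "few minutes of LED lighting" comparison is easy to verify with back-of-the-envelope arithmetic. The sketch below assumes a typical efficient LED bulb draws about 10 W (that wattage is our assumption, not a figure from the article):

```python
# Back-of-the-envelope check of the "few minutes of LED lighting" claim.
QUERY_ENERGY_WH = 0.34   # estimated energy of an average ChatGPT query (Wh)
LED_POWER_W = 10         # assumed power draw of an efficient LED bulb (W)

# Energy (Wh) divided by power (W) gives hours; multiply by 60 for minutes.
minutes_of_lighting = QUERY_ENERGY_WH / LED_POWER_W * 60
print(f"One query = about {minutes_of_lighting:.1f} minutes of LED lighting")
# prints: One query = about 2.0 minutes of LED lighting
```

At roughly two minutes of lighting per query, the claim of "several hours" or "a day" of bulb use is off by two to three orders of magnitude.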
The confusion often stems from flawed comparisons. Consumption varies with the complexity of the query and the type of content generated: producing text uses far less energy than generating images, which demand intensive processing on graphics cards.
Comparison with other AI and technologies
The figures published by Google for its chatbot Gemini indicate a consumption of 0.24 Wh per text query, in line with the estimates for ChatGPT. This suggests that, although AI may seem energy-intensive, technological advances and optimizations have reduced its energy footprint.
Furthermore, research conducted by independent institutions, such as Epoch AI, confirms that the energy impact of text queries is far lower than often reported. Current AI models are designed to be more efficient, both in computation and in energy consumption.
Image and video generation: an energy challenge
AI image generation consumes on average about 60 times more energy than text generation. The difference comes from the iterative noising and denoising process needed to create visuals. Even so, early estimates equating one image to a full day of lighting are exaggerated.
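Applying the reported 60x factor to the 0.34 Wh text-query figure gives a rough sense of the scale (combining the two numbers this way is our assumption; the article does not state a per-image figure):

```python
# Rough scaling of image-generation energy from the article's two figures.
TEXT_QUERY_WH = 0.34     # average text query (Wh)
IMAGE_MULTIPLIER = 60    # reported ratio of image to text generation

image_wh = TEXT_QUERY_WH * IMAGE_MULTIPLIER
print(f"One image = about {image_wh:.1f} Wh")
# prints: One image = about 20.4 Wh

# For context: a 10 W LED running 24 hours uses 240 Wh, so even a
# generated image stays well below "a day of lighting".
```

Roughly 20 Wh per image is substantial compared with text, but still an order of magnitude short of the day-of-lighting comparisons sometimes cited.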
Generative video, although less developed, also raises energy questions. Early assessments suggest that its consumption could be high, but precise data is still lacking to draw definitive conclusions.
Impact on an individual and industrial scale
Hannah Ritchie, a researcher specializing in the environment, estimates that individual use of ChatGPT has a negligible energy impact. However, on an industrial scale, where AI usage is multiplying, the impact becomes more concerning.
Data centers, essential for AI operations, represent a significant portion of overall energy consumption. The capacity of power grids to absorb this growing load is one of the challenges for the future, especially as AI applications continue to develop.
Context: OpenAI and the evolution of ChatGPT
OpenAI, the company behind ChatGPT, was founded in 2015 with the mission to promote and develop beneficial artificial intelligence for humanity. Since the launch of ChatGPT, the model has undergone several iterations, with notable improvements in terms of efficiency and performance.
The GPT-4 model, put into service in 2023, has since been followed by further versions. These developments have reduced the energy footprint of queries while increasing the model's ability to handle complex requests. OpenAI continues to innovate, seeking to balance technological progress with environmental concerns.