Islamabad, Feb 12: A new study is challenging previous estimates of ChatGPT's energy consumption, revealing that the AI chatbot uses far less power than initially believed. The analysis, conducted by Epoch AI, a nonprofit AI research group, finds that a typical ChatGPT query consumes approximately 0.3 watt-hours, a tenth of the widely cited 3 watt-hours per query. The revised figure suggests that ChatGPT is not as power-hungry as previously thought: a single query uses far less energy than many everyday activities, such as running household appliances or driving a car.
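To see what the gap between the two estimates means at scale, here is a rough back-of-envelope sketch. The daily query volume used below is an illustrative assumption, not a figure from the study; only the 3 Wh and 0.3 Wh per-query values come from the article.

```python
# Back-of-envelope comparison of the old and revised per-query estimates.
# QUERIES_PER_DAY is an assumed, illustrative global query volume.
OLD_WH_PER_QUERY = 3.0            # widely cited earlier estimate
NEW_WH_PER_QUERY = 0.3            # Epoch AI's revised estimate
QUERIES_PER_DAY = 1_000_000_000   # assumption for illustration only

def daily_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours (1 MWh = 1,000,000 Wh)."""
    return wh_per_query * queries / 1_000_000

old = daily_mwh(OLD_WH_PER_QUERY, QUERIES_PER_DAY)
new = daily_mwh(NEW_WH_PER_QUERY, QUERIES_PER_DAY)
print(f"Old estimate: {old:,.0f} MWh/day")  # 3,000 MWh/day
print(f"New estimate: {new:,.0f} MWh/day")  # 300 MWh/day
```

Under these assumed volumes, the revision shifts the fleet-wide estimate by thousands of megawatt-hours per day, which is why the per-query figure matters so much for grid planning.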

Joshua You, a data analyst at Epoch AI, explained to TechCrunch that the lower power consumption reflects improvements in AI hardware, noting in particular that OpenAI has shifted to more energy-efficient chips. The 3-watt-hour figure, often cited in earlier studies, was based on outdated assumptions and no longer reflects ChatGPT's current energy usage. However, You also cautioned that these figures remain approximations, since OpenAI has not disclosed full details of ChatGPT's power consumption.

Despite these positive findings, concerns about the broader environmental impact of AI continue to grow. Recently, over 100 organizations signed an open letter urging AI companies and regulators to address the rising energy demands of AI data centers. These centers, which power AI models like ChatGPT, could place significant strain on power grids and increase reliance on nonrenewable energy sources.

Looking ahead, the energy demands of AI are expected to rise as the models become more complex. A report from Rand suggests that by 2027, AI data centers could consume nearly all of California’s 2022 power capacity, and by 2030, training advanced AI models might require as much electricity as eight nuclear reactors. In response, companies like OpenAI, along with partners such as Microsoft, Oracle, and SoftBank, are investing billions to expand AI infrastructure.

While OpenAI has been working on developing more energy-efficient models, such as GPT-4o-mini, the growing adoption of AI globally may still lead to increased energy consumption. Experts suggest that users concerned about their energy impact may want to limit AI usage or choose smaller, less power-intensive models to help balance growth with energy efficiency.
