Artificial Intelligence (AI) is currently the most prominent field in the IT industry. AI excels at generating images, writing text, and finding and summarizing the information users want. In fields that require long-term research by experts, such as semiconductor design and new drug development, AI can even propose solutions humans have not considered.
However, using AI comes at a significant cost. High-performance chipsets are essential for performing complex calculations, and they require considerable energy. Training AI models on hundreds of thousands of data points also consumes energy. In short, using AI conveniently demands a substantial amount of power.
The power consumed by AI could rival that of a country
According to Schneider Electric, a global energy solutions company, running AI workloads in countries excluding China already draws about 4.3 gigawatts (GW) of power, and the company warned that this figure could reach 20 GW by 2028. Dr. Alex de Vries of Vrije Universiteit Amsterdam, in a paper published in the scientific journal 'Joule', estimated that a single AI request consumes about as much electricity as leaving an LED light bulb on for an hour.
Data centers handling AI operations also consume a significant amount of power. The famous AI service ChatGPT receives an average of about 200 million requests daily. By de Vries' estimate, that is enough energy to keep 200 million LED light bulbs lit for an hour every day. As more fields adopt AI, consumption will undoubtedly climb steeply. De Vries estimated that by around 2027, AI data centers worldwide could consume around 100 terawatt-hours (TWh) of power annually, comparable to the annual power consumption of Sweden, the Netherlands, or Argentina.
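The figures above can be sanity-checked with a back-of-envelope calculation. This sketch is illustrative only: the 3 Wh-per-request value is an assumption loosely matching the "one LED bulb for an hour" comparison, not a measured number from the paper.

```python
# Rough estimate of ChatGPT's energy use from the figures in the article.
# WH_PER_REQUEST is an assumed value (~ one LED bulb lit for an hour).

REQUESTS_PER_DAY = 200_000_000   # ~200 million requests daily
WH_PER_REQUEST = 3.0             # assumption: Wh per request

daily_mwh = REQUESTS_PER_DAY * WH_PER_REQUEST / 1e6   # Wh -> MWh
annual_twh = daily_mwh * 365 / 1e6                    # MWh -> TWh

print(f"Daily: {daily_mwh:.0f} MWh, annual: {annual_twh:.2f} TWh")
# -> Daily: 600 MWh, annual: 0.22 TWh
```

Even at this scale, a single service accounts for only a fraction of the projected 100 TWh for all AI data centers, which shows how much of the growth is expected to come from AI spreading into new fields.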
Big Tech companies ‘losing money’… Searching for ways to improve energy efficiency
As power consumption increases, so does the burden on companies providing AI services. On October 9 (local time), the Wall Street Journal (WSJ) reported that Microsoft (MS) is losing about $20 per user per month on its AI coding service GitHub Copilot, which launched last year, and that heavy users of the AI features cost Microsoft as much as $80 per month.
In response, companies operating AI services, government departments, and research teams in various fields are weighing how to solve the problem of AI's excessive energy consumption. On one hand, they are exploring ways to increase energy production, and whether AI itself can help in that process. On the other hand, since energy production cannot be increased indefinitely, they are also looking for ways to reduce the power needed to run AI features.
In May this year, Microsoft signed a contract to purchase power from Helion Energy, a company developing electricity generation from nuclear fusion, beginning in 2028. NotebookCheck, an overseas PC media outlet that reported the news, stated that Microsoft also plans to use small modular reactors (SMRs) to meet the power demands of its AI data centers, and as part of this plan has posted job advertisements for nuclear technology program managers.
According to a report by the American online media outlet Utility Dive on October 23, Jeff Duncan, a U.S. Representative from South Carolina, asked industry insiders for suggestions on using AI to improve operations in the oil, gas, and nuclear sectors.
In response, Edward Abbo, President and CTO of the enterprise AI software development company C3.ai, said that AI could increase the production of oil and gas companies. He argued that monitoring on-site assets such as pipelines with AI could reduce the energy needed for the entire production process.
Paul Dabbar, CEO of Bohr Quantum Technology, said that installing sensors on power generator turbines and circuit breakers could improve performance. As operating data accumulates over time, AI can predict when maintenance is needed, which increases generator availability and lets plants produce more power for longer periods. He added that some power plants have already adopted this technology.
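The predictive-maintenance idea Dabbar describes can be sketched very simply: compare recent sensor readings against the accumulated baseline and flag equipment whose behavior is drifting. The function below is a toy stand-in for a trained AI model, with made-up readings and thresholds, just to illustrate the mechanism.

```python
# Toy illustration of predictive maintenance from accumulated sensor data.
# A real system would use a trained model; here a simple threshold on the
# recent-vs-baseline vibration average stands in for it.

def maintenance_due(vibration_readings, window=5, threshold=1.2):
    """Flag maintenance when the average of the last `window` readings
    exceeds the historical baseline by `threshold` (e.g. +20%)."""
    if len(vibration_readings) < window * 2:
        return False  # not enough history accumulated yet
    history = vibration_readings[:-window]
    baseline = sum(history) / len(history)
    recent = sum(vibration_readings[-window:]) / window
    return recent > baseline * threshold

# Healthy turbine: vibration stays flat, no maintenance flagged.
print(maintenance_due([1.0] * 10))               # -> False
# Degrading turbine: vibration creeps up in the latest readings.
print(maintenance_due([1.0] * 10 + [1.5] * 5))   # -> True
```

The payoff described in the article comes from catching the second case early, before an unplanned outage takes the generator offline.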
A research team from Northwestern University in the U.S. announced in the journal Nature Electronics a nanoelectronic device that reduces energy consumption. The team reported that replacing the silicon in transistors used for AI tasks with molybdenum disulfide and carbon nanotubes improved power efficiency by about 100 times.
The research team said the new transistor is tiny and consumes little power, making it suitable for wearable devices. Existing AI technology consumes a lot of energy and suffers long delays because data requiring computation is sent to a cloud server and the results are transferred back to the device. The team's technology, by contrast, is energy-efficient and can run AI features on the device itself without a cloud connection. It saves energy, works in environments where network access is impossible, and, by keeping personal information on the device, eliminates the worry of data leakage.
When you care about energy saving… Avoid habitual AI overuse
As more fields apply AI, energy consumption is rising sharply. This has led to calls to avoid using AI technology indiscriminately: do not force AI into areas where it is unnecessary, and do not overuse it when it adds nothing.
For example, for users who simply want to look up a keyword on a web search service, an AI-generated summary of the results is a feature that wastes energy unnecessarily. Experts have likewise urged people to curb the habit of generating text or images with generative AI when it is not needed. Some users, fascinated by the novelty of image-generation AI, repeatedly create images they will never use, and considerable energy is consumed each time they press the generate button. Eliminating such cases alone would go a long way toward reducing energy consumption.
By Lee Byung Chan