By Haley Zaremba – Oct 17, 2024, 4:00 PM CDT
The increasing energy demands of AI pose a significant challenge to sustainability and decarbonization goals.
Researchers have discovered a new integer-addition algorithm that could reduce AI’s energy footprint by 95%.
The tech sector is exploring various solutions, including clean energy sources and more efficient computing methods, to address AI’s energy consumption problem.
The use of artificial intelligence is rising rapidly as machine learning and large language models become a mainstream part of everyday applications. As AI applications proliferate, so do their energy footprints and associated greenhouse gas emissions. Training AI models requires huge amounts of computing energy, so much so that electricity demand in developed countries increased in 2024 after years of plateaued growth.
While AI holds enormous promise for building more efficient, stable, and smart energy grids, it also poses a major threat to our energy security and decarbonization goals. As the size and reach of the sector skyrocket, the computational power needed to sustain AI's growth is doubling approximately every 100 days. ChatGPT alone currently requires around 564 MWh each day, enough energy to power about 18,000 homes in the United States. At this scale and speed of growth, it's unclear where countries like the United States will source enough energy to meet the rapidly expanding demands of the tech sector, much less do so in a climate-friendly manner.
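For a rough sense of how those figures fit together, the sketch below checks the homes-powered claim and the growth rate implied by a 100-day doubling time. The per-home figure is an assumption here, roughly the EIA's average U.S. household consumption of about 10,800 kWh per year divided across 365 days.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: an average U.S. home draws ~31 kWh/day
# (roughly 10,800 kWh/year / 365 days).
chatgpt_mwh_per_day = 564
home_kwh_per_day = 31  # assumed average household draw

homes_powered = chatgpt_mwh_per_day * 1_000 / home_kwh_per_day
print(f"Homes powered: {homes_powered:,.0f}")  # ~18,200, consistent with the ~18,000 cited

# Demand doubling every ~100 days compounds to more than 12x per year.
doublings_per_year = 365 / 100
print(f"Annual growth factor: {2 ** doublings_per_year:.1f}x")  # ~12.6x
```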
As a result, the tech sector is scrambling to find new sources of clean energy. Sam Altman of OpenAI, the company behind ChatGPT, has been an outspoken proponent of increased investment in nuclear fission power production as well as nuclear fusion research and development to meet AI's energy needs. "The AI systems of the future will need tremendous amounts of energy and fission and fusion can help deliver them," Altman told the Wall Street Journal last year. Bill Gates, too, has pledged billions toward nuclear energy investments to help clean up the tech sector and keep it clean as data centers continue to proliferate.
In addition to pursuing increased clean energy production, many researchers and scientists are experimenting with ways to make AI itself more energy-efficient. These efforts largely revolve around alternative modes of computing that require less energy per calculation. One such solution is the potential application of quantum computing to AI, which would allow large language models to perform highly complex computations faster and with far fewer resources. In certain cases, quantum computers could be 100 times more energy-efficient than current supercomputers.
But there is another potential computational intervention that would be far simpler and more realistic to implement in the near term, since quantum computing remains more theoretical than practical in critical domains. A team of engineers at BitEnergy AI, an AI inference technology company, has discovered that a novel integer-addition algorithm could reduce AI's energy footprint by a whopping 95%. Their findings were published this month in a preprint on arXiv, the open-access repository hosted by Cornell University.
“The new technique is basic—instead of using complex floating-point multiplication (FPM), the method uses integer addition,” Tech Xplore recently reported. “Apps use FPM to handle extremely large or small numbers, allowing applications to carry out calculations using them with extreme precision. It is also the most energy-intensive part of AI number crunching.”
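The underlying idea can be illustrated with a long-standing approximation trick: because an IEEE-754 float stores its exponent and mantissa as packed integer fields, simply adding the raw bit patterns of two positive floats, then subtracting the bit pattern of 1.0, lands close to the bit pattern of their product. The Python sketch below is not BitEnergy AI's published algorithm, only a minimal illustration of how a floating-point multiply can be traded for a single integer addition.

```python
import struct

BIAS = 0x3F800000  # IEEE-754 float32 bit pattern of 1.0

def bits(x: float) -> int:
    """Raw 32-bit integer view of a float32 value."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def unbits(b: int) -> float:
    """Float32 value of a raw 32-bit pattern."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

def approx_mul(a: float, b: float) -> float:
    # One integer addition stands in for a floating-point multiply:
    # adding the bit patterns sums the exponents and (approximately)
    # the mantissas; subtracting BIAS removes the doubled exponent bias.
    # Sign handling is omitted: a and b are assumed positive.
    return unbits(bits(a) + bits(b) - BIAS)

print(approx_mul(1.5, 2.0), 1.5 * 2.0)    # 3.0 vs 3.0 (exact here)
print(approx_mul(3.7, 0.42), 3.7 * 0.42)  # ~1.53 vs 1.554 (small error)
```

The approximation is crude, with errors of up to a few percent per multiply, but it conveys why the trade pays off: an integer adder uses a small fraction of the silicon area and power of a full floating-point multiplier.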
This new technological breakthrough cannot be implemented soon enough. AI is expected to account for 3.5 percent of global electricity consumption by 2030. Together with electric vehicles, AI is on track to add 290 terawatt-hours of electricity demand to the United States grid over the same period, matching the energy consumption of the entire country of Turkey, the world's 18th-largest economy, according to projections by Rystad Energy.
“When you look at the numbers, it is staggering,” Jason Shaw, chairman of the Georgia Public Service Commission, a U.S. electricity regulator, told the Washington Post earlier this year. “It makes you scratch your head and wonder how we ended up in this situation. How were the projections that far off? This has created a challenge like we have never seen before.”
Thankfully, it seems that researchers are rising to this challenge, and the way that we run large language models could soon be far more energy-efficient without compromising performance.
By Haley Zaremba for Oilprice.com