
New Method Cuts Energy Use in Training Language Models by 30%

Researchers Develop Energy-Efficient Training Method for AI Models

A recent study by researchers at the University of Michigan has unveiled an energy-efficient method for training large language models. The new approach achieves the same results in the same amount of time while cutting energy consumption by 30%. The implications are significant: applied broadly, the projected savings by 2026 would be enough to power 1.1 million American households.

Introduction of Perseus

The software tool, named Perseus, identifies the critical path: the sequence of sub-tasks that takes the longest to complete during training. Because processors working off the critical path would otherwise finish early and sit idle at full power, Perseus slows them down so that every sub-task completes at the same moment, eliminating unnecessary power consumption without extending the overall training time.
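The idea can be illustrated with a minimal sketch. The function below is a hypothetical simplification, not the actual Perseus implementation: given the duration of each parallel sub-task, it treats the slowest one as the critical path and assigns every other processor a proportionally lower clock frequency, assuming (as a simplification) that runtime scales inversely with frequency.

```python
def plan_speeds(durations_ms, max_freq_mhz=1980):
    """Toy model of critical-path-aware speed scaling.

    durations_ms: time each parallel sub-task takes at full speed.
    Returns a per-task frequency so all tasks finish together with
    the slowest (critical) one, rather than finishing early and idling.
    Assumes runtime is inversely proportional to clock frequency.
    """
    critical = max(durations_ms)  # length of the critical path
    # A sub-task with slack can run slower by the ratio of its
    # duration to the critical path's duration.
    return [max_freq_mhz * d / critical for d in durations_ms]

speeds = plan_speeds([120, 90, 120, 60])
# The 120 ms sub-tasks stay at full clock; the 60 ms one runs at half,
# so it draws less power yet still finishes on time.
```

Real GPUs expose only discrete frequency steps and power does not fall linearly with clock speed, so the actual scheduling problem Perseus solves is considerably harder than this toy model suggests.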


Testing Perseus

In their experiments, the team used Perseus to train GPT-3, three other large language models, and a computer vision model. Across these tests, Perseus reduced the energy consumed by training while matching the speed of conventional methods.

Impact on AI Accessibility

The researchers emphasized that their energy-saving method matters for equitable access to artificial intelligence. Where electricity supplies are limited, organizations may be forced to rely on remote services or to settle for smaller, less accurate models, deepening existing disparities in access to AI technology across communities.

Significance of Energy Optimization

The findings of this study underscore the potential for optimizing AI training methods to not only conserve energy but also to maintain efficiency. This is a critical consideration as the demand for large language models continues to grow, particularly in a world increasingly focused on reducing carbon footprints and enhancing sustainability.

Conclusion

As the field of artificial intelligence evolves, the development of tools like Perseus represents a promising shift towards more sustainable practices. By addressing energy consumption concerns, researchers are paving the way for a future where AI technology is accessible and environmentally responsible.

Key Points

  1. The University of Michigan has developed Perseus, reducing AI training energy consumption by 30%.
  2. This method could save energy equivalent to powering 1.1 million households by 2026.
  3. Perseus maintains training speed while optimizing processor use.
  4. Energy efficiency in AI is crucial for equitable access across communities.
  5. The study highlights the importance of reducing carbon footprints in technology development.

