New Method Cuts Energy Use in Training Language Models by 30%
Researchers at the University of Michigan have developed Perseus, a software tool that reduces energy consumption during the training of large language models by 30%. The energy saved could be enough to power an estimated 1.1 million households in 2026, underscoring the importance of energy efficiency in AI development.