
Methods to make AI faster, greener, and more accessible for everyone.
Artificial intelligence is becoming a part of everyday life, powering tools from language translation to speech recognition and healthcare systems. However, training the large AI models behind these tools consumes vast amounts of computing power, memory, and energy. This raises important questions about efficiency, cost, and sustainability.
To tackle these challenges, PhD researcher Qiao Xiao explored how AI models can focus only on the most relevant parts during training, instead of using every part of the model and all available data. This approach, called dynamic sparsity, helps reduce resource use without sacrificing performance. He defended his PhD thesis on Thursday, September 11.
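The core idea of dynamic sparsity is that the set of active connections in a network changes during training: weak connections are dropped and new ones are grown, so only a fraction of the model is ever in use. The sketch below is a generic, illustrative NumPy version of this drop-and-regrow cycle (in the style of methods such as SET); it is not code from the thesis, and the function name and parameters are placeholders chosen for clarity.

```python
import numpy as np

def prune_and_regrow(weights, mask, drop_frac=0.3, rng=None):
    """One illustrative dynamic-sparsity step: deactivate the
    smallest-magnitude active weights, then grow the same number
    of new connections at random inactive positions."""
    rng = rng or np.random.default_rng(0)
    active = np.flatnonzero(mask)
    n_drop = int(drop_frac * active.size)
    # Drop: remove the n_drop active weights with smallest magnitude.
    order = active[np.argsort(np.abs(weights.flat[active]))]
    dropped = order[:n_drop]
    mask.flat[dropped] = False
    weights.flat[dropped] = 0.0
    # Regrow: activate n_drop random currently-inactive positions
    # (new weights start at zero and are trained from there).
    inactive = np.flatnonzero(~mask)
    grown = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[grown] = True
    return weights, mask

# Toy example: a 4x4 layer kept at 50% sparsity throughout training.
rng = np.random.default_rng(42)
w = rng.normal(size=(4, 4))
m = np.zeros((4, 4), dtype=bool)
m.flat[rng.choice(16, size=8, replace=False)] = True
w = w * m  # zero out inactive connections
w, m = prune_and_regrow(w, m, drop_frac=0.25, rng=rng)
print(m.sum())  # the sparsity budget is preserved: still 8 active weights
```

The key property shown here is that the total number of active connections stays fixed, so memory and compute budgets remain constant while the network continually reallocates capacity to more useful connections.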
Saving resources at every level
To overcome the computational and energy challenges of AI, Qiao Xiao approached the problem on three fronts: making the model itself more efficient, using data more wisely, and reducing communication overhead.
[....]