July 22, 2025 - NTT Research and NTT R&D presented 12 papers at ICML 2025, addressing critical challenges in AI development. Their work includes findings on why algorithms that improve LLM accuracy can degrade factual recall and reasoning abilities, alongside a new 'Portable Tuning' technology designed to reduce retraining costs. These advances aim to make AI more sustainable while maintaining performance.
The Physics of AI Group identified unintended consequences of optimising large language models (LLMs) for factual accuracy, revealing trade-offs between precision and reasoning capabilities. Meanwhile, NTT R&D's Portable Tuning technology enables efficient model adaptation across different tasks without full retraining, addressing environmental and financial sustainability concerns. NTT Research highlighted the importance of these innovations in creating more responsible AI systems.
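The announcement does not describe how Portable Tuning works internally, so the sketch below is only an illustrative assumption, not NTT's actual method: it shows the general idea behind transferable-tuning techniques, in which the weight changes introduced by fine-tuning on a task are captured once and re-applied to a different or updated base model instead of retraining from scratch. All function names and the toy NumPy "weights" are hypothetical.

```python
# Illustrative sketch of reusable tuning (hypothetical, not NTT's Portable Tuning).
# Idea: record the per-layer delta a fine-tune introduces, then graft that delta
# onto another base model so the task behaviour carries over without retraining.

import numpy as np

def extract_task_delta(base_weights, tuned_weights):
    """Per-layer difference introduced by fine-tuning on a task."""
    return {name: tuned_weights[name] - base_weights[name] for name in base_weights}

def apply_task_delta(new_base_weights, delta, scale=1.0):
    """Re-apply the task delta to a different (e.g. updated) base model."""
    return {name: w + scale * delta.get(name, 0.0) for name, w in new_base_weights.items()}

# Toy example: random matrices stand in for real model parameters.
rng = np.random.default_rng(0)
base_v1  = {"layer0": rng.normal(size=(4, 4)), "layer1": rng.normal(size=(4, 4))}
tuned_v1 = {k: v + 0.05 * rng.normal(size=v.shape) for k, v in base_v1.items()}  # fine-tuned on a task
base_v2  = {k: v + 0.01 * rng.normal(size=v.shape) for k, v in base_v1.items()}  # updated base model

delta  = extract_task_delta(base_v1, tuned_v1)
ported = apply_task_delta(base_v2, delta)  # task tuning reused; no full retraining
print(ported["layer0"].shape)
```

The point of the sketch is the cost model: only the small delta needs to be stored and re-applied, which is what makes this class of technique attractive for the sustainability goals described above.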
These developments align with global efforts to address AI's environmental footprint and improve model efficiency. The Portable Tuning approach directly challenges the 'bigger is better' paradigm in AI development, advocating for resource-conscious solutions. This shift reflects broader industry trends toward sustainable AI infrastructure, particularly as energy costs and regulatory pressures mount.
Our view: NTT's work exemplifies the critical need for sustainable AI innovation. By prioritising efficiency and ethical considerations, their research sets a benchmark for responsible development that balances technological progress with environmental stewardship.