
Surrey ‘brain-inspired AI’ cuts energy by 99%

In testing, the groundbreaking approach to developing artificial neural networks matched or exceeded the accuracy of established models. 

The approach, known as Topographical Sparse Mapping (TSM), was developed by the University of Surrey’s Nature-Inspired Computation and Engineering group, with results of initial trials first published in Neurocomputing.

This approach essentially involves rewiring the neural framework used by generative and other modern AI systems. Whereas ChatGPT and other leading platforms connect each neuron in one layer to every neuron in the next, the new approach dramatically cuts the number of connections, leaving connections only between neurons that are related or located next to each other.
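To make the idea concrete, here is a minimal, illustrative sketch in Python (PyTorch) of a layer whose neurons connect only to topographically nearby neurons. The class name, the one-dimensional neuron layout and the radius parameter are assumptions made for this example, not details taken from the published method.

```python
import torch
import torch.nn as nn

class TopographicalSparseLinear(nn.Module):
    """Illustrative linear layer whose weights are masked so that each output
    neuron connects only to input neurons at nearby (topographically adjacent)
    positions. The binary mask is fixed at construction; masked weights stay zero."""

    def __init__(self, in_features, out_features, radius=0.01):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

        # Lay input and output neurons out on the same 1-D axis and keep only
        # connections whose endpoints lie within `radius` of each other.
        in_pos = torch.linspace(0.0, 1.0, in_features)
        out_pos = torch.linspace(0.0, 1.0, out_features)
        mask = (in_pos.unsqueeze(0) - out_pos.unsqueeze(1)).abs() <= radius
        self.register_buffer("mask", mask.float())

    def forward(self, x):
        # Zero every non-neighbouring connection before the matrix multiply.
        return x @ (self.weight * self.mask).t() + self.bias


# Example: with a small radius, the vast majority of possible connections are removed.
layer = TopographicalSparseLinear(784, 256, radius=0.005)
print(1.0 - layer.mask.mean().item())  # fraction of connections pruned away
```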

A more advanced version takes this even further, mimicking the brain’s gradual process of refining its neural connections as it learns, improving accuracy and efficiency. Transferred to the AI world, this is known as Enhanced Topographical Sparse Mapping, and it delivers even better results. Overall, the Surrey team have achieved 99% sparsity compared with established artificial intelligence models, using less than 1% of their average energy consumption. 
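The article does not describe the exact refinement rule, but as a rough sketch of the general idea, assuming the illustrative layer above, connections could be pruned progressively during training by dropping the weakest of those that remain. The function below is a hypothetical example of such a step, not the Surrey team’s algorithm.

```python
import torch

def refine_connections(layer, fraction=0.1):
    """Hypothetical refinement step (not the published algorithm): permanently
    remove the weakest `fraction` of the layer's remaining connections by
    zeroing them in the mask, loosely mimicking synaptic pruning."""
    with torch.no_grad():
        magnitudes = (layer.weight * layer.mask).abs()
        active = layer.mask > 0
        k = int(fraction * active.sum().item())
        if k == 0:
            return
        # Magnitude below which the weakest active connections are dropped.
        threshold = magnitudes[active].kthvalue(k).values
        layer.mask[active & (magnitudes <= threshold)] = 0.0
        layer.weight.data *= layer.mask  # keep weights consistent with the mask


# Hypothetical use during training, e.g. every few epochs:
# for epoch in range(num_epochs):
#     train_one_epoch(model)              # assumed training loop
#     if epoch % 5 == 0:
#         refine_connections(layer, fraction=0.05)
```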

‘Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity, which is equivalent to the annual use of more than a hundred US homes, and cost tens of millions of dollars,’ said Dr Roman Bauer, Senior Lecturer at the University of Surrey’s School of Computer Science and Electronic Engineering. ‘That simply isn’t sustainable at the rate AI continues to grow. Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.’

‘The brain achieves remarkable efficiency through its structure, with each neuron forming connections that are spatially well-organised,’ added Mohsen Kamelian Rad, a PhD student at the University of Surrey and lead author of the study. ‘When we mirror this topographical design, we can train AI systems that learn faster, use less energy and perform just as accurately. It’s a new way of thinking about neural networks, built on the same biological principles that make natural intelligence so effective.’

