New Learning Algorithm Could Enable AI Applications to Be Integrated Into Mobile Devices

The large amount of energy consumed by the learning activities of artificial neural networks is one of the biggest impediments to the broad use of Artificial Intelligence (AI), especially in mobile applications.

One approach to solving this problem can be drawn from knowledge about the human brain. Although it has the computing power of a supercomputer, it requires only 20 watts, a millionth of the energy a supercomputer consumes.

One of the reasons for this is the efficient transfer of information between neurons in the brain. Neurons send brief electrical impulses, or spikes, to other neurons, but, to save energy, only as often as absolutely necessary.

Information Processing Based on Events

A research team led by two computer scientists, Wolfgang Maass and Robert Legenstein of TU Graz, has employed this principle in the creation of the new machine learning algorithm e-propagation (e-prop).

Scientists at the Institute of Theoretical Computer Science, which is also part of the European flagship Human Brain Project, used spikes in their model for communication between neurons in an artificial neural network.

The spikes only become active when they are needed for information processing in the network. Learning is a particular challenge for such sparsely active networks, since longer observations are required to determine which neuron connections improve network performance.

Earlier techniques achieved too little learning success or required enormous storage space. E-prop solves this problem through a decentralized method copied from the brain, in which each neuron records in a so-called e-trace, or eligibility trace, when its connections were used.
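The idea of a local eligibility trace can be sketched as follows. This is an illustrative simplification, not the paper's exact equations: each synapse keeps a decaying record of recent presynaptic activity, and a weight update then needs only this local trace plus a learning signal, rather than the stored history of the whole network.

```python
def update_trace(trace, pre_spike, decay=0.8):
    """Decay the old trace and bump it when the presynaptic neuron fires."""
    return decay * trace + pre_spike

def weight_update(weight, trace, learning_signal, lr=0.1):
    """e-prop-style local update: the learning signal is gated by the trace."""
    return weight + lr * learning_signal * trace

# Usage: the trace rises on a spike and fades afterwards, so a later
# learning signal can still credit recently active connections.
trace = 0.0
trace = update_trace(trace, pre_spike=1)   # presynaptic neuron fired
trace = update_trace(trace, pre_spike=0)   # trace decays toward zero
w = weight_update(1.0, trace, learning_signal=0.5)
```

Because each synapse only consults its own trace, no central buffer of past network activity is needed.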

TU Graz computer scientists Robert Legenstein and Wolfgang Maass (from left) are working on energy-efficient AI systems based on the way the human brain functions. [Image: Lunghammer—TU Graz]
The method is roughly as powerful as the best and most elaborate known learning methods. A paper detailing the method has been published in the scientific journal Nature Communications.

Online Rather Than Offline

With most of the machine learning techniques currently in use, all network activities are stored centrally and offline in order to trace, every few steps, how the connections were used during the calculations.

However, this requires constant data transfer between memory and processors, which is one of the main reasons for the excessive energy consumption of current AI technology. E-prop, by contrast, works entirely online and requires no separate memory even in real operation, making learning much more energy efficient.
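The memory argument can be made concrete with a back-of-the-envelope sketch. The numbers and function names are hypothetical: an offline method that unrolls the computation must buffer every neuron's activity at every time step, whereas an online method keeps only a fixed amount of state per synapse, independent of the sequence length.

```python
def offline_memory(num_neurons, num_steps):
    """Values buffered by an offline method that replays the whole history."""
    return num_neurons * num_steps

def online_memory(num_synapses):
    """One running eligibility trace per synapse, independent of time."""
    return num_synapses

# Hypothetical network: 1,000 neurons, 100,000 synapses, 10,000 time steps.
offline = offline_memory(1_000, 10_000)   # grows with sequence length
online = online_memory(100_000)           # constant in sequence length
```

The offline buffer grows without bound as sequences get longer, while the online state stays fixed; this is what removes the constant memory-to-processor traffic.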

The scientists' goal is for such computing systems to no longer learn energy-intensively only via a cloud, but to efficiently integrate the greater part of the learning capacity into mobile hardware components and thereby save energy.

The first steps to bring e-prop into applications have already been taken. For instance, the TU Graz team is collaborating with the Advanced Processor Technologies Research Group (APT) of the University of Manchester in the Human Brain Project to incorporate e-prop into the neuromorphic SpiNNaker system developed there.

Simultaneously, the TU Graz team is working with researchers from the semiconductor manufacturer Intel to integrate the algorithm into the next version of the company's neuromorphic chip Loihi.

This research work is based in the Fields of Expertise ‘Human and biotechnology’ and ‘Information, Communication & Computing,’ two of the five Fields of Expertise of TU Graz.
