Innovations in Spiking Neural Networks: Making AI More Responsive and Energy-Efficient

Category: Computer Science

tldr #

In their new study in Nature Machine Intelligence, Bojian Yin and Sander Bohté from CWI demonstrate a major step forward in developing artificial intelligence programs that are more responsive and energy-efficient. Using brain-like neurons and a novel learning algorithm, they train a spiking neural network of six million neurons, much larger than any spiking network trained before.


content #

In a new study in Nature Machine Intelligence, researchers Bojian Yin and Sander Bohté from the HBP partner Dutch National Research Institute for Mathematics and Computer Science (CWI) demonstrate a significant step towards artificial intelligence that can be used in local devices like smartphones and in VR-like applications, while protecting privacy.

They show how brain-like neurons, combined with novel learning methods, make it possible to train fast and energy-efficient spiking neural networks at a large scale. Potential applications range from wearable AI to speech recognition and augmented reality.

Spiking neural networks use electrical pulses to activate neurons.

While modern artificial neural networks are the backbone of the current AI revolution, they are only loosely inspired by networks of real, biological neurons such as our brain. The brain, however, is a much larger network, far more energy-efficient, and can respond ultra-fast when triggered by external events. Spiking neural networks are special types of neural networks that more closely mimic the working of biological neurons: the neurons of our nervous system communicate by exchanging electrical pulses, and they do so only sparingly.
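The article does not specify the neuron model, but the principle can be illustrated with a leaky integrate-and-fire (LIF) neuron, a common choice in spiking networks. The Python sketch below is a minimal illustration with made-up parameter values: the neuron integrates its input over time and emits a pulse only when a threshold is crossed, so activity stays sparse.

```python
import numpy as np

def simulate_lif(inputs, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a single leaky integrate-and-fire neuron.

    inputs: 1-D array of input current per timestep.
    Returns a binary spike train of the same length.
    """
    decay = np.exp(-dt / tau)        # membrane leak per timestep
    v = 0.0                          # membrane potential
    spikes = np.zeros(len(inputs))
    for t, i_in in enumerate(inputs):
        v = decay * v + i_in         # leaky integration of the input
        if v >= v_thresh:            # fire only when the threshold is crossed...
            spikes[t] = 1.0
            v = v_reset              # ...then reset, keeping activity sparse
    return spikes

# Weak, noisy input yields only occasional spikes (sparse communication).
rng = np.random.default_rng(0)
train = simulate_lif(rng.uniform(0.0, 0.12, size=200))
print(f"{int(train.sum())} spikes in 200 timesteps")
```

In a network, only these occasional pulses are passed between neurons, which is what makes spiking networks so cheap to run on suitable hardware.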

Neuromorphic hardware is the technological embodiment of spiking neural networks on chips.

Implemented in chips, called neuromorphic hardware, such spiking neural networks hold the promise of bringing AI programs closer to users—on their own devices. These local solutions are good for privacy, robustness and responsiveness. Applications range from speech recognition in toys and appliances to health care monitoring, drone navigation and local surveillance.

Just like standard artificial neural networks, spiking neural networks need to be trained to perform such tasks well. However, the way in which such networks communicate poses serious challenges. "The algorithms needed for this require a lot of computer memory, allowing us to only train small network models mostly for smaller tasks. This holds back many practical AI applications so far," says Sander Bohté of CWI's Machine Learning group. In the Human Brain Project, he works on architectures and learning methods for hierarchical cognitive processing.
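To see why memory becomes the bottleneck: the standard training method, backpropagation through time, must store every neuron's activity for every timestep of an input before it can compute updates. Here is a back-of-the-envelope calculation for a network the size of the one in this study, with an assumed sequence length and numeric precision (both illustrative figures, not from the paper):

```python
n_neurons = 6_000_000      # network size reported in the study
timesteps = 1_000          # assumed sequence length (illustrative)
bytes_per_value = 4        # float32 activations (illustrative)

# Backpropagation through time keeps all past activations around.
bptt_bytes = n_neurons * timesteps * bytes_per_value
# An online rule keeps only the current state plus one trace per neuron.
online_bytes = n_neurons * 2 * bytes_per_value

print(f"BPTT:   {bptt_bytes / 1e9:.1f} GB of stored activations")  # 24.0 GB
print(f"Online: {online_bytes / 1e9:.3f} GB")                      # 0.048 GB
```

Under these assumptions, the stored history dwarfs the network itself, which is why conventional training has been limited to small models.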

Current learning methods for spiking neural networks require far more energy and computing power than the brain does.

--- Mimicking the learning brain ---

Learning remains a big challenge for these algorithms: they cannot match the learning ability of our brain. The brain learns immediately from new experiences by changing connections, or even by making new ones. It also needs far fewer examples to learn something, and it works more energy-efficiently. "We wanted to develop something closer to the way our brain learns," says Bojian Yin.

The new online learning algorithm developed in the study enables spiking neural networks to be trained at a much larger scale.

Yin explains how this works: if you make a mistake during a driving lesson, you learn from it immediately. You correct your behavior right away, not an hour later. "You learn, as it were, while taking in the new information. We wanted to mimic that by giving each neuron of the neural network a bit of information that is constantly updated. That way, the network learns how the information changes and doesn't have to remember all the previous information. This is the big difference from current networks, which have to work with all the previous changes. The current way of learning requires enormous computing power and thus a lot of memory and energy."
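The article does not spell out the authors' learning rule, but the idea Yin describes—each neuron carries a small, constantly updated piece of information instead of a record of everything that came before—can be sketched as a forward, trace-based update. The toy below uses a single linear leaky unit rather than a full spiking neuron, and all names and parameter values are our own illustration, not the authors' algorithm:

```python
import numpy as np

def online_step(w, trace, v, x_t, target, lr=0.01, decay=0.95):
    """One online learning step for a single leaky unit.

    Instead of replaying the whole past (as backpropagation through
    time does), each weight keeps a running 'trace' of how past inputs
    still influence the current state; the trace is updated in place.
    """
    v = decay * v + w @ x_t        # leaky state update
    trace = decay * trace + x_t    # d(state)/d(weights), updated recursively
    error = v - target             # simple squared-error learning signal
    w = w - lr * error * trace     # learn immediately, no stored history
    return w, trace, v

# Toy usage: learn to track a target value from a stream of inputs.
rng = np.random.default_rng(1)
w, trace, v = 0.1 * rng.normal(size=5), np.zeros(5), 0.0
for _ in range(500):
    x_t = rng.normal(size=5)
    w, trace, v = online_step(w, trace, v, x_t, target=0.5)
```

Because each update needs only the current trace, memory use stays constant no matter how long the input is—the property that lets such networks scale.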

The new algorithm more closely mimics the brain's natural learning behavior.

--- 6 million neurons ---

The new online learning algorithm makes it possible to learn directly from the data, enabling much larger spiking neural networks while at the same time reducing complexity and energy consumption. With his team, Yin trained a network of six million neurons using the new algorithm. Yin: "This is much larger than any spiking neural network that has been trained before, and it is a big step forward in understanding how the brain learns fast and energy-efficiently."

The new algorithm successfully trained a network of 6 million neurons.
