Neuromorphic Computing and Its Impact on AI Training
Category: Machine Learning
Sunday, September 10, 2023, 03:20 UTC

The Max Planck Institute for the Science of Light in Erlangen, Germany, has developed a technique to efficiently train AI using physical processes instead of digital computers. Their research revealed that neuromorphic computing provides a more energy-efficient way to train neural networks for AI applications. This could reduce energy expenditure and limit the strain on resources caused by the growing demand for AI systems.
Artificial intelligence not only delivers impressive performance but also creates significant demand for energy: the more demanding the tasks for which it is trained, the more energy it consumes. Víctor López-Pastor and Florian Marquardt, two scientists at the Max Planck Institute for the Science of Light in Erlangen, Germany, present a method by which artificial intelligence could be trained much more efficiently. Their approach relies on physical processes instead of the digital artificial neural networks currently used. The work is published in the journal Physical Review X.
OpenAI, the company behind ChatGPT, has not revealed how much energy was required to train GPT-3, the model that makes the chatbot eloquent and apparently well-informed. According to the German statistics company Statista, training would require about 1,000 megawatt-hours, roughly as much as 200 German households with three or more people consume annually. While this energy expenditure has allowed GPT-3 to learn whether the word "deep" is more likely to be followed by the word "sea" or "learning" in its data sets, by all accounts it has not understood the underlying meaning of such phrases.
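The household comparison is simple unit arithmetic; here is a quick sanity check of the figures, assuming (as the article's numbers imply) an annual consumption of roughly 5,000 kWh for a German household of three or more people:

```python
# Back-of-envelope check of the Statista comparison cited above.
# ASSUMPTION: ~5,000 kWh/year per household of three or more people;
# this figure is an illustrative assumption, not from the article.

TRAINING_ENERGY_MWH = 1_000        # estimated GPT-3 training energy
HOUSEHOLD_KWH_PER_YEAR = 5_000     # assumed annual household consumption

training_kwh = TRAINING_ENERGY_MWH * 1_000           # MWh -> kWh
households = training_kwh / HOUSEHOLD_KWH_PER_YEAR   # years of household use

print(f"{training_kwh:,.0f} kWh covers {households:,.0f} households for a year")
```

With these inputs the arithmetic reproduces the article's figure of about 200 households.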
Neural networks on neuromorphic computers
To reduce the energy consumption of computers, and of AI applications in particular, several research institutions have in recent years been investigating an entirely new concept of how computers could process data in the future. The concept is known as neuromorphic computing. Although this sounds similar to artificial neural networks, it in fact has little to do with them, as artificial neural networks run on conventional digital computers. There, the software, or more precisely the algorithm, is modeled on the brain's way of working, but digital computers serve as the hardware. They perform the calculation steps of the neural network in sequence, one after the other, differentiating between processor and memory.
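The sequential, processor-and-memory style of computation described above looks like this in ordinary code. This is a toy sketch with made-up layer sizes, not the institute's setup: each layer's weights are fetched from memory, multiplied on the processor, and the result written back before the next step can begin.

```python
import numpy as np

# Toy dense network evaluated the conventional, sequential way.
# Layer sizes are arbitrary illustration values (an assumption,
# not taken from the article).
rng = np.random.default_rng(0)
layer_sizes = [784, 256, 64, 10]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    activation = x
    for w in weights:                          # strictly one layer after another
        activation = np.tanh(activation @ w)   # fetch weights, compute, store result
    return activation

out = forward(rng.standard_normal(784))
print(out.shape)   # (10,)
```

Every pass over `weights` moves the full parameter set between memory and processor; at the scale of hundreds of billions of parameters, that data transfer dominates the energy budget, which is the bottleneck neuromorphic hardware aims to remove.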
"The data transfer between these two components alone devours large quantities of energy when a neural network trains hundreds of billions of parameters, i.e., synapses, with up to one terabyte of data," says Marquardt, director of the Max Planck Institute for the Science of Light and professor at the University of Erlangen.
The human brain is entirely different and would probably never have been evolutionarily competitive, had it worked with an energy efficiency similar to that of computers with silicon transistors. It would most likely have failed due to overheating. The brain is characterized by undertaking the numerous steps of a thought process in parallel and not sequentially. The nerve cells, or more precisely the synapses, are both processor and memory combined. Various systems around the world are being treated as possible candidates for the neuromorphic counterparts to our nerve cells, including photonic circuits utilizing light instead of electrons to perform calculations. Their components serve simultaneously as switches and memory cells.
A self-learning physical machine optimizes its synapses independently
Together with López-Pastor, a researcher in light-based quantum computing, Marquardt was granted access to the AlphaOne quantum computer of D-Wave Systems in Vancouver, Canada. This type of computer uses miniature superconducting loop circuits for its calculation steps, providing a physical implementation of a neuromorphic computer.
"This worked remarkably well, since we were able to measure a considerably higher energy efficiency with our quantum computer as compared to a classical digital computer, although its size and the available training data were identical," explains Marquardt.
The results of their experiments show that neuromorphic computing offers greater energy efficiency than digital computing when training a neural network. This could lead to significant reductions in the energy expenditure of AI applications. However, the potential of neuromorphic computing has yet to be fully explored.