Tapping Into the Power of the Human Brain with Spiking Neural Networks
Science Saturday - June 17, 2023

Spiking Neural Networks (SNNs) are brain-inspired computing systems that leverage the co-integration of optoelectronic neurons, analog electrical circuits, and Mach-Zehnder interferometer meshes to mimic the way real neurons process information. This approach offers potential advantages in performance and speed over traditional neural networks and machine learning systems.
Perfect recall, computational wizardry and rapier wit: that's the brain we all want, but how does one design such a brain? The real thing comprises roughly 80 billion neurons that coordinate with one another through tens of thousands of connections each, in the form of synapses. The human brain has no centralized processor the way a standard laptop does. Instead, many calculations run in parallel and their outcomes are compared. While the operating principles of the human brain are not fully understood, existing mathematical algorithms can be used to rework deep learning principles into systems that operate more like a human brain. This brain-inspired computing paradigm, the spiking neural network (SNN), provides a computing architecture well aligned with the potential advantages of systems that use both optical and electronic components.
In SNNs, information is processed in the form of spikes, or action potentials: the electrical impulses that real neurons emit when they fire. A key feature is asynchronous processing: spikes are handled as they occur in time rather than in batches, as in traditional neural networks. This lets SNNs react quickly to changes in their inputs and perform certain computations more efficiently than traditional neural networks.
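To make the event-driven idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python/NumPy. It processes input spikes one event at a time; between events the membrane potential simply decays, so no work is done while the input is silent. The parameter values (TAU, V_THRESH, and so on) are illustrative defaults, not anything from the paper.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron driven by an
# asynchronous stream of input spike events. All parameters are
# illustrative defaults, not values from the paper.
TAU = 20.0        # membrane time constant (ms)
V_THRESH = 1.0    # firing threshold
V_RESET = 0.0     # reset potential after a spike
W_IN = 0.3        # synaptic weight of the input

def run_lif(spike_times_ms):
    """Process input spikes event by event, in time order."""
    v, t_prev = 0.0, 0.0
    output_spikes = []
    for t in sorted(spike_times_ms):
        # The membrane potential decays exponentially between events,
        # so nothing is computed while the input is silent.
        v *= np.exp(-(t - t_prev) / TAU)
        v += W_IN                      # integrate the incoming spike
        if v >= V_THRESH:              # threshold crossing -> fire
            output_spikes.append(t)
            v = V_RESET
        t_prev = t
    return output_spikes

# A dense burst fires the neuron; widely spaced spikes decay away.
print(run_lif([1, 2, 3, 4, 50, 120]))  # -> [4]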
SNNs can also implement kinds of neural computation that are difficult or impossible in traditional neural networks, such as temporal processing and spike-timing-dependent plasticity (STDP), a form of Hebbian learning that lets neurons change their synaptic connections based on the timing of their spikes. (Hebbian learning is summarized as "cells that fire together wire together." It lends itself to math that models the plasticity of the brain's learning capacity.)
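A minimal pair-based STDP rule makes that timing dependence explicit. The exponential windows and learning rates below are common textbook choices, not values from the paper:

```python
import numpy as np

# Pair-based STDP: strengthen the synapse when the presynaptic spike
# precedes the postsynaptic spike, weaken it otherwise. Window and
# learning-rate values are common textbook choices, not from the paper.
A_PLUS, A_MINUS = 0.01, 0.012     # learning rates (LTP / LTD)
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # STDP time windows (ms)

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair."""
    dt = t_post - t_pre
    if dt > 0:   # pre before post: "fire together, wire together"
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    else:        # post before pre: weaken the connection
        return -A_MINUS * np.exp(dt / TAU_MINUS)

print(stdp_dw(10.0, 15.0))   # pre leads post -> positive change (LTP)
print(stdp_dw(15.0, 10.0))   # post leads pre -> negative change (LTD)
```

Note that the sign of the update depends only on the relative timing of the two spikes, information that is local to the synapse itself.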
A recently published paper in the IEEE Journal of Selected Topics in Quantum Electronics describes the development of an SNN device leveraging the co-integration of optoelectronic neurons, analog electrical circuits, and Mach-Zehnder interferometer (MZI) meshes. These meshes are optical circuit components that can perform matrix multiplication, much as synaptic meshes do in the human brain.
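For intuition on how such a mesh multiplies, here is a sketch of a single MZI: two 50:50 couplers around programmable phase shifters realize a 2x2 unitary matrix applied to a pair of optical modes, and a mesh of such elements composes larger matrices. The coupler convention and phase placement below are one common textbook parameterization, not the paper's exact design:

```python
import numpy as np

# One Mach-Zehnder interferometer (MZI): two 50:50 couplers with
# programmable phase shifts. Each MZI applies a 2x2 unitary to a pair
# of optical modes; a triangular or rectangular mesh of MZIs composes
# an arbitrary unitary matrix. Sketch only.
BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # 50:50 coupler

def mzi(theta, phi):
    """2x2 unitary realized by one MZI with phase settings theta, phi."""
    internal = np.diag([np.exp(1j * theta), 1.0])
    external = np.diag([np.exp(1j * phi), 1.0])
    return BS @ internal @ BS @ external

U = mzi(0.7, 1.3)
print(np.allclose(U @ U.conj().T, np.eye(2)))  # True: lossless (unitary)

# The "matrix multiplication" happens physically: optical amplitudes
# go in, transformed amplitudes come out, at the speed of light.
x = np.array([1.0, 0.0])       # input optical amplitudes
print(U @ x)                   # output amplitudes
```

The appeal is that the multiply costs essentially no time or energy beyond propagating the light through the mesh.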
The authors showed that optoelectronic neurons can accept input from an optical communication network, process the information through analog electrical circuits, and communicate back to the network through a laser. This arrangement allows faster data transfer and communication than traditional all-electronic systems.
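As a rough caricature of that detect-process-emit loop, the toy model below runs one such neuron. The responsivity, threshold, and laser power values, and the function name itself, are illustrative stand-ins, not the device physics reported in the paper:

```python
import numpy as np

# Toy model of one optoelectronic neuron: detect optical inputs,
# process electrically, re-emit optically. Transfer functions are
# illustrative stand-ins, not the paper's device physics.
RESPONSIVITY = 0.8   # photodetector: optical watts in -> amps out (A/W)
I_THRESH = 0.5       # analog circuit's firing threshold (a.u.)
P_LASER = 1.0        # laser output power when the neuron fires (a.u.)

def optoelectronic_neuron(optical_powers, weights):
    # 1) Photodetection: weighted optical inputs sum into a current.
    current = RESPONSIVITY * np.dot(weights, optical_powers)
    # 2) Analog electrical stage: a threshold decides whether to spike.
    fired = current >= I_THRESH
    # 3) Laser: the spike is re-emitted as light back into the network.
    return P_LASER if fired else 0.0

print(optoelectronic_neuron(np.array([0.9, 0.6]), np.array([0.5, 0.4])))
```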
The paper also describes the use of existing algorithms, such as Random Backpropagation and Contrastive Hebbian Learning, to create brain-inspired computing systems. These algorithms let the system learn from information local to each synapse, much as the human brain does, offering significant advantages in computing performance over traditional machine learning systems that rely on end-to-end backpropagation.
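Random Backpropagation is often called feedback alignment: the backward pass uses a fixed random matrix in place of the transposed forward weights, so each update needs only locally available signals rather than a global "weight transport" step. The toy regression below sketches the idea; it is not the paper's on-chip learning setup, and the network sizes and learning rate are arbitrary:

```python
import numpy as np

# Feedback alignment ("random backpropagation"): the error is sent
# backward through a fixed random matrix B instead of W2.T, so no
# weight transport is needed. Toy regression sketch only.
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (8, 2))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (1, 8))   # hidden -> output weights
B = rng.normal(0, 0.5, (8, 1))    # fixed random feedback matrix
lr = 0.05

for step in range(2000):
    x = rng.uniform(-1, 1, (2, 1))
    target = x[0] - x[1]                 # toy target function
    h = np.tanh(W1 @ x)                  # forward pass
    y = W2 @ h
    err = y - target                     # output error
    # Backward pass: B replaces W2.T -- purely local updates.
    dh = (B @ err) * (1 - h**2)
    W2 -= lr * err @ h.T
    W1 -= lr * dh @ x.T

print(abs(err.item()))  # last-sample error; should shrink with training
```

Remarkably, the forward weights tend to "align" with the random feedback matrix over training, which is why learning still works without exact gradient transport.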
For AI and machine learning, SNNs offer several advantages over mainstream computing paradigms on tasks that resemble the conditions in which spiking computation naturally occurs, such as processing continuous, event-driven sensory streams. By taking a brain-inspired approach, these systems can better mimic the way the human brain processes information, yielding more natural behavior on such data.