The Future is Neuromorphic Computing: DeepSouth Brings Brain Power to Supercomputers
Category: Science Sunday - December 24, 2023, 15:50 UTC

DeepSouth, a supercomputer set to go online in April 2024, will be the world's first supercomputer capable of simulating networks of neurons and synapses at the scale of the human brain. Neuromorphic computing is modeled on the brain and aims to unlock the secrets of its remarkably low power consumption, an approach that has become attractive as transistor miniaturization slows toward its physical limits.
A supercomputer scheduled to go online in April 2024 will rival the estimated rate of operations in the human brain, according to researchers in Australia. The machine, called DeepSouth, is capable of performing 228 trillion operations per second.
It’s the world’s first supercomputer capable of simulating networks of neurons and synapses (key biological structures that make up our nervous system) at the scale of the human brain.
DeepSouth belongs to an approach known as neuromorphic computing, which aims to mimic the biological processes of the human brain. It will be run from the International Center for Neuromorphic Systems at Western Sydney University.
Our brain is the most amazing computing machine we know. By distributing its computing power across billions of small units (neurons) that interact through trillions of connections (synapses), the brain can rival the most powerful supercomputers in the world, while requiring only about as much power as a fridge light bulb.
Supercomputers, meanwhile, generally take up lots of space and need large amounts of electrical power to run. The world’s most powerful supercomputer, the Hewlett Packard Enterprise Frontier, can perform just over one quintillion operations per second. It covers 680 square meters (7,300 square feet) and requires 22.7 megawatts to run.
Our brains can perform the same number of operations per second with just 20 watts of power, while weighing just 1.3 to 1.4 kilograms. Among other things, neuromorphic computing aims to unlock the secrets of this amazing efficiency.
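To put those figures side by side, here is a rough back-of-the-envelope comparison using only the estimates quoted above (roughly one quintillion operations per second in each case, 22.7 megawatts for Frontier versus about 20 watts for the brain); the numbers are the article's estimates, not precise measurements:

```python
# Back-of-the-envelope efficiency comparison using the figures quoted above.
frontier_ops_per_s = 1e18     # ~one quintillion operations per second
frontier_power_w   = 22.7e6   # 22.7 megawatts

brain_ops_per_s = 1e18        # roughly comparable rate, per the estimate above
brain_power_w   = 20          # ~20 watts

frontier_efficiency = frontier_ops_per_s / frontier_power_w   # operations per watt
brain_efficiency    = brain_ops_per_s / brain_power_w          # operations per watt

print(f"Frontier: {frontier_efficiency:.2e} ops/W")
print(f"Brain:    {brain_efficiency:.2e} ops/W")
print(f"Brain is roughly {brain_efficiency / frontier_efficiency:,.0f}x more efficient")
```

On these estimates, the brain comes out roughly a million times more energy efficient per operation, which is the gap neuromorphic designs hope to narrow.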
Transistors at the Limits
On June 30, 1945, the mathematician and physicist John von Neumann described the design of a new machine, the Electronic Discrete Variable Automatic Computer (EDVAC). This effectively defined the modern electronic computer as we know it.
My smartphone, the laptop I am using to write this article, and the most powerful supercomputer in the world all share the same fundamental structure introduced by von Neumann almost 80 years ago. These all have distinct processing and memory units, where data and instructions are stored in the memory and computed by a processor.
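As a purely illustrative sketch of that structure, the toy Python program below keeps instructions and data in one shared memory while a separate "processor" loop fetches and executes them one at a time; the tiny instruction set is invented for this example and does not correspond to any real machine:

```python
# Toy illustration of the von Neumann model: a single memory holds both
# instructions and data, and a separate processor fetches and executes them.
memory = [
    ("LOAD", 5),    # load the value stored at address 5 into the accumulator
    ("ADD", 6),     # add the value stored at address 6
    ("STORE", 7),   # write the accumulator back to address 7
    ("HALT", None),
    None,           # address 4: unused
    2,              # address 5: data
    3,              # address 6: data
    0,              # address 7: result goes here
]

accumulator, program_counter = 0, 0
while True:
    op, addr = memory[program_counter]   # fetch the next instruction from memory
    program_counter += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[7])  # 5 -- data and instructions shared the same memory throughout
```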
For decades, the number of transistors on a microchip doubled approximately every two years, an observation known as Moore’s Law. This allowed us to have smaller and cheaper computers.
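To see what that doubling implies, the short calculation below projects forward from the roughly 2,300 transistors of the Intel 4004 (1971); the strict two-year doubling period is the idealized Moore's Law figure, not a measurement:

```python
# Illustrative Moore's-law arithmetic: doubling transistor counts every two
# years, starting from the Intel 4004 (~2,300 transistors in 1971).
start_year, start_transistors = 1971, 2_300

def moores_law_estimate(year):
    doublings = (year - start_year) / 2
    return start_transistors * 2 ** doublings

for year in (1971, 1991, 2011, 2023):
    print(year, f"{moores_law_estimate(year):,.0f}")
```

The projection reaches tens of billions of transistors by the 2020s, roughly the scale of today's largest chips.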
However, transistor sizes are now approaching the atomic scale. At these tiny sizes, excessive heat generation is a problem, as is a phenomenon called quantum tunneling, which interferes with the functioning of the transistors. This is slowing down and will eventually halt transistor miniaturization.
To overcome this issue, scientists are exploring new approaches to computing, starting from the powerful computer we all have hidden in our heads, the human brain. Our brains do not work according to John von Neumann’s model of the computer. They don’t have separate computing and memory areas.
They instead work by connecting billions of nerve cells that communicate information in the form of electrical impulses. Information passes from one neuron to the next through a junction called a synapse. The organization and composition of these networks of neurons and synapses are called the human brain's connectome, and understanding it could potentially revolutionize computing.
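To make the idea concrete, here is a minimal sketch of the kind of spiking neuron model neuromorphic systems simulate: two leaky integrate-and-fire neurons joined by a single synapse. The parameters and the simple update rule are illustrative only and are not taken from DeepSouth's design:

```python
# Minimal leaky integrate-and-fire sketch: two neurons joined by one synapse,
# the kind of spiking model neuromorphic hardware simulates at vast scale.
def simulate(steps=100, dt=1.0):
    v_pre, v_post = 0.0, 0.0        # membrane potentials (arbitrary units)
    threshold, leak = 1.0, 0.05     # spike threshold and leak rate
    synaptic_weight = 0.6           # strength of the pre -> post connection
    input_current = 0.12            # constant drive into the first neuron

    spikes = []
    for t in range(steps):
        # Presynaptic neuron: integrate the input, leaking a little each step.
        v_pre += dt * (input_current - leak * v_pre)
        pre_spiked = v_pre >= threshold
        if pre_spiked:
            v_pre = 0.0             # reset after a spike

        # Postsynaptic neuron: driven only by spikes arriving across the synapse.
        v_post += dt * ((synaptic_weight if pre_spiked else 0.0) - leak * v_post)
        if v_post >= threshold:
            spikes.append(t)
            v_post = 0.0

    return spikes

print(simulate())   # time steps at which the second neuron fires
```

Rather than shuttling data between a processor and a separate memory, a neuromorphic machine updates enormous numbers of simple units like these in parallel, with the "memory" living in the synaptic weights themselves.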