The Hardware Requirements for Future AI Advances
Machine Learning | November 8, 2023, 07:53 UTC

The world of computing is edging closer to the world of artificial intelligence, but the hardware requirements remain a challenge. Researchers from Purdue University, the University of California San Diego (UCSD), and the École Supérieure de Physique et de Chimie Industrielles (ESPCI) in Paris, France, have found a new way to rework hardware by mimicking the synapses of the human brain using vanadium oxides. Neuromorphic architectures promise lower energy consumption along with enhanced computation, native learning, and pattern recognition.
Technology is edging ever closer to the super-speed world of artificial intelligence computing. But is the world equipped with the proper hardware to handle the workload of new AI breakthroughs? "The brain-inspired codes of the AI revolution are largely being run on conventional silicon computer architectures, which were not designed for it," explains Erica Carlson, 150th Anniversary Professor of Physics and Astronomy at Purdue University.
In a joint effort, physicists from Purdue University, the University of California San Diego (UCSD), and the École Supérieure de Physique et de Chimie Industrielles (ESPCI) in Paris, France, believe they may have discovered a way to rework the hardware by mimicking the synapses of the human brain. They have published their findings, "Spatially Distributed Ramp Reversal Memory in VO2," in Advanced Electronic Materials.
New paradigms in hardware will be necessary to handle the complexity of tomorrow's computational advances. According to Carlson, the lead theoretical scientist on this research, "neuromorphic architectures hold promise for lower energy consumption processors, enhanced computation, fundamentally different computational modes, native learning and enhanced pattern recognition."
Neuromorphic architecture boils down to computer chips that mimic brain behavior. Neurons are the cells in the brain that transmit information; synapses are the small gaps at the ends of neurons that allow signals to pass from one neuron to the next. In biological brains, these synapses encode memory. The team concludes that vanadium oxides show tremendous promise for neuromorphic computing because they can be used to make both artificial neurons and synapses.
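To make the neuron-and-synapse analogy concrete, here is a minimal sketch in Python (illustrative only, not drawn from the paper; the function and parameter names are my own assumptions). It models one leaky integrate-and-fire neuron fed by a single plastic synapse: the synaptic weight acts as the stored memory, while the neuron integrates incoming signals and fires when a threshold is crossed.

```python
# Minimal sketch (illustrative, not from the paper): a leaky
# integrate-and-fire neuron driven through one plastic synapse.
# The synaptic weight `w` is the stored "memory"; the membrane
# potential `v` is the neuron's internal state.

def simulate(spikes_in, w=0.8, v_thresh=1.0, leak=0.9, lr=0.01):
    """Run a binary spike train through one synapse into one neuron."""
    v = 0.0
    spikes_out = []
    for s in spikes_in:
        v = leak * v + w * s        # leaky integration of synaptic input
        if v >= v_thresh:           # threshold crossed: the neuron fires
            spikes_out.append(1)
            v = 0.0                 # reset after firing
            w += lr * s             # Hebbian-style strengthening of the synapse
        else:
            spikes_out.append(0)
    return spikes_out, w

out, w_final = simulate([1, 0, 1, 1, 0, 1, 1, 1])
print(out, round(w_final, 3))
```

The promise of neuromorphic hardware is to realize states like `v` and `w` directly in a material's physics, rather than simulating them in software on silicon that was never designed for it.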
Video: Carlson and Zimmers discuss the field of neuromorphic quantum materials. Credit: Quantum Coffee House
"The dissonance between hardware and software is the origin of the enormously high energy cost of training, for example, large language models like ChatGPT," explains Carlson. "By contrast, neuromorphic architectures hold promise for lower energy consumption by mimicking the basic components of a brain: neurons and synapses. Whereas silicon is good at memory storage, the material does not easily lend itself to neuron-like behavior.
"Ultimately, to provide efficient, feasible neuromorphic hardware solutions requires research into materials with radically different behavior from silicon—ones that can naturally mimic synapses and neurons. Unfortunately, the competing design needs of artificial synapses and neurons mean that most materials that make good synaptors fail as neuristors, and vice versa. Only a handful of materials, most of them quantum materials, have the demonstrated ability to do both."
The team relied on a recently discovered type of non-volatile memory in vanadium oxides, driven by repeated partial temperature cycling through the insulator-to-metal transition.
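For a rough intuition of how such a memory could arise, consider a hypothetical toy model (my own illustration, not the paper's analysis): an ensemble of domains, each switching from insulator to metal at its own transition temperature, where every partial heating ramp that reverses at a temperature T_rev slightly nudges the domains near the reversal point. Repeated cycling then leaves a persistent, readable change in resistance.

```python
import numpy as np

# Hypothetical toy model (my illustration, not the paper's analysis):
# each domain switches insulator -> metal at its own transition
# temperature. Every partial heating ramp that reverses at T_rev
# slightly lowers the transition temperature of domains near the
# reversal point, so repeated cycling imprints a persistent change.

rng = np.random.default_rng(0)
T_c = rng.normal(340.0, 5.0, 10_000)    # per-domain transition temps (K)

def resistance(T):
    """Read-out: resistance scales with the still-insulating fraction."""
    insulating = np.mean(T_c > T)       # fraction insulating at temperature T
    return 1.0 + 9.0 * insulating       # arbitrary units

def partial_cycle(T_rev, shift=0.05, width=1.0):
    """Heat to T_rev, then cool back, nudging domains near T_rev."""
    near = np.abs(T_c - T_rev) < width  # domains at the ramp-reversal front
    T_c[near] -= shift                  # small persistent change per cycle

T_read = 338.0
print("before cycling:", round(resistance(T_read), 4))
for _ in range(50):                     # repeated partial temperature cycling
    partial_cycle(T_rev=338.0)
print("after cycling: ", round(resistance(T_read), 4))
```

The real effect is far richer; the paper's title indicates the memory is spatially distributed across the sample. But the sketch captures the key idea: a reversal temperature can be written into the material and later read back as a resistance change.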
Alexandre Zimmers, lead experimental scientist from Sorbonne University and École Supérieure de Physique et de Chimie Industrielles, explains, "We provided evidence that a ramp reversal memory can be realized in VO2 and used in a variety of neuromorphic applications, as the material is metastable and allows for a few reshifts and non-Hamiltonian behavior."