Sustainable AI: Exploring Sustainable Alternatives For Powering The Future Of Generative AIs
Category: Artificial Intelligence · Sunday, August 27 2023, 22:33 UTC

Generative AIs are highly energy-consuming systems, and their energy demand threatens their development. To address this issue, IBM recently created an energy-efficient 14 nm analog chip, which could one day power smartphones and other electronic devices. Other alternative technologies, such as brain-inspired computing and photonic computing, are also being studied as potential sustainable solutions for powering the future of AI.
ChatGPT, DALL-E, Stable Diffusion, and other generative AIs have taken the world by storm. They create fabulous poetry and images. They’re seeping into every nook of our world, from marketing to writing legal briefs and drug discovery. They seem like the poster child for a man-machine mind meld success story.
But under the hood, things are looking less peachy. These systems are massive energy hogs, requiring data centers that spit out thousands of tons of carbon emissions—further stressing an already volatile climate—and suck up billions of dollars. As the neural networks become more sophisticated and more widely used, energy consumption is likely to skyrocket even more.
Plenty of ink has been spilled on generative AI’s carbon footprint. Its energy demand could be its downfall, hindering development as the technology scales. Using current hardware, generative AI is "expected to stall soon if it continues to rely on standard computing hardware," said Dr. Hechen Wang at Intel Labs.
It’s high time we build sustainable AI.
This week, a study from IBM took a practical step in that direction. They created a 14-nanometer analog chip packed with 35 million memory units. Unlike current chips, computation happens directly within those units, nixing the need to shuttle data back and forth—in turn saving energy.
Data shuttling can increase energy consumption anywhere from 3 to 10,000 times above what’s required for the actual computation, said Wang.
The chip was highly efficient when challenged with two speech recognition tasks. One, Google Speech Commands, is small but practical. Here, speed is key. The other, Librispeech, is a mammoth system that helps transcribe speech to text, taxing the chip’s ability to process massive amounts of data.
When pitted against conventional computers, the chip was just as accurate but finished the job faster and with far less energy, using less than a tenth of what’s normally required for some tasks.
"These are, to our knowledge, the first demonstrations of commercially relevant accuracy levels on a commercially relevant model…with efficiency and massive parallelism" for an analog chip, the team said.
Brainy Bytes
This is hardly the first analog chip. However, it pushes the idea of neuromorphic computing into the realm of practicality—a chip that could one day power your phone, smart home, and other devices with an efficiency near that of the brain.
Um, what? Let’s back up.
Current computers are built on the Von Neumann architecture. Think of it as a house with multiple rooms. One, the central processing unit (CPU), analyzes data. Another stores memory.
For each calculation, the computer must shuttle data back and forth between those two rooms, which costs time and energy and reduces efficiency.
The brain, in contrast, combines both computation and memory in a studio apartment. Its mushroom-like junctions, called synapses, both form neural networks and store memories at the same location. Synapses are highly flexible, adjusting how strongly they connect to other neurons based on stored memory and new learning; that connection strength is called a "weight." Our brains quickly adapt to new environments thanks to this feature.
This is where neuromorphic computing comes in. Think of it as one room combining the CPU and memory. To conserve energy, these chips use artificial synapses the way the brain does, adjusting weights and storing them in the same place.
These chips do this by stacking arrays of transistors on top of one another. The elements are interconnected and influence one another, updating their weights when prompted, without burning energy moving data around each time.
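The contrast above can be sketched in a few lines of code. In an analog crossbar array, a layer's weights are stored as device conductances; applying input voltages to the rows yields, by Ohm's and Kirchhoff's laws, column currents that equal the matrix-vector product, so the multiply happens where the weights live. This is a minimal illustrative sketch, not IBM's actual design; all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights "programmed" into the array as conductances (hypothetical values).
weights = rng.normal(size=(4, 3))
# Input voltages applied along the rows.
inputs = rng.normal(size=4)

# Analog view: each column's output current is the sum, down that column,
# of (input voltage) * (device conductance). Computation happens in memory.
currents = np.array(
    [sum(inputs[i] * weights[i, j] for i in range(4)) for j in range(3)]
)

# Von Neumann view: the same product, but the weights must first be moved
# from a separate memory to the processor before multiplying.
reference = inputs @ weights

# Both routes give the same answer; what differs is the energy spent
# shuttling data, which the analog approach avoids.
assert np.allclose(currents, reference)
```

The point of the sketch is that the mathematics of a neural-network layer is unchanged; only the physical location of the computation moves, which is where the energy savings come from.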
The Future Is Bright
With this breakthrough, researchers are starting to think outside the box about how to create sustainable AI. For example, brain-inspired computing, where processing happens on chips mimicking neurons, could be a serious contender.
Adiabatic computing, which recovers energy as circuit elements switch on and off instead of dissipating it, could also be used in data centers. Photonic computing, which computes with light rather than electrons, is another possibility.
It’s still too early to call which one will make it, but these and other prospects offer hope that a greener computing environment is on the horizon.
As we move closer to the commercialization of these and other green computing solutions, new research papers, initiatives, and projects in this direction are sure to follow. With governments and other organizations focusing on decarbonization and increased sustainability, the future of AI in the near to long term looks likely to be powered by renewable energy sources.