Analog Chips Find a New Lease of Life in Artificial Intelligence

The need for speed is a hot topic among attendees at this week’s AI Hardware Summit: bigger AI language models, faster chips, and more bandwidth so AI systems can make accurate predictions.

However, some hardware startups are countering that more-is-better mindset with a different take on AI computing. Companies like Innatera, Rain Neuromorphics, and others are developing silicon brains with analog circuitry that mimics how the brain works.

The brain is analog in nature, taking in raw sensory data, and these chipmakers are trying to recreate the behavior of its neurons and synapses in traditional analog circuits.

Analog chips can be very good low-power sensor devices, particularly for some audio and video applications, said Kevin Krewell, an analyst at Tirias Research.

“Analog is a more accurate representation of how the brain acts, using distributed memory cells to hold neuron weights, or otherwise holding an analog weight,” Krewell said.

AI and machine learning mostly rely on digital chips, whether at the edge or in data centers. But there is a place for analog chips at the edge, in devices like smartphones or cars that need instant intelligence without sending data to the cloud for processing.

“We don’t want to replace the entire AI pipeline,” said Sumeet Kumar, CEO of Innatera Nanosystems BV, based in Rijswijk, the Netherlands.

Innatera’s third-generation AI chip has 256 neurons and 65,000 synapses, which doesn’t sound like much next to the human brain’s 86 billion neurons running on around 20 watts. But Kumar said a fully connected recurrent network can be built on top of that, and the chip can run on coin-cell batteries.

Customers use the chip to run radar and audio applications with performance competitive with other chips in its class. The goal is to bring a measure of on-device learning and inference to the edge, something attendees at the show regard as a major challenge for AI.

“What we’re trying to do, what we’re recognizing, is that when data is transferred from a sensor to the cloud, it’s actually being transformed in multiple phases by different types of AI. And what we see very often is customers processing low-level sensor data in the cloud, which is completely unnecessary,” said Kumar.

The Innatera chip takes information coming from a sensor and converts it into spikes, and the content of the input is encoded in the precise timing of those spikes.

“That’s exactly what happens in your brain. If you hear anything, there are… tiny hairs [cells] in your ear that actually detect each frequency band and how high the energy is in that band. And those hairs [cells] will vibrate, creating spikes that will then travel to the rest of your auditory cortex. Essentially, we follow exactly the same principle,” said Kumar.
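Kumar doesn’t detail the encoding scheme, but the general idea of temporal coding can be sketched in a few lines: each frequency band of a signal drives a unit that emits a spike when the band’s energy crosses a threshold, so the information lives in when spikes occur rather than in multi-bit activation values. The band edges, threshold, and test signal below are illustrative assumptions, not Innatera’s implementation.

```python
import numpy as np

# Toy temporal encoding: a noisy 440 Hz tone sampled at 8 kHz (illustrative).
fs = 8000
t = np.arange(0, 0.5, 1 / fs)
signal = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(t.size)

# Split the signal into short frames and measure the energy in a few
# frequency bands, loosely playing the role of the hair cells Kumar describes.
frame = 256
bands = [(0, 300), (300, 1000), (1000, 4000)]  # Hz, assumed band edges
threshold = 1000.0                             # assumed energy threshold

spike_times = {band: [] for band in bands}
for start in range(0, t.size - frame, frame):
    chunk = signal[start:start + frame]
    spectrum = np.abs(np.fft.rfft(chunk)) ** 2
    freqs = np.fft.rfftfreq(frame, 1 / fs)
    for lo, hi in bands:
        energy = spectrum[(freqs >= lo) & (freqs < hi)].sum()
        if energy > threshold:
            # The content is carried by *when* the spike happens.
            spike_times[(lo, hi)].append(start / fs)

for band, times in spike_times.items():
    print(band, "Hz ->", len(times), "spikes")
```

Only the band containing the tone crosses the threshold in this sketch, so its spike train alone tells a downstream network what was heard and when.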

In the brain, concentrations of ions such as calcium and sodium inside neurons change over time. Innatera’s chip replicates the same behavior using currents.

“We scale how much current goes into the neuron and comes out of the neuron. That’s how we mimic the brain,” Kumar said.
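Ion concentrations that rise and decay over time map naturally onto a leaky integrate-and-fire model, a standard abstraction in which an input current charges a membrane potential that leaks away and emits a spike when it crosses a threshold. The parameters below are illustrative assumptions, not figures from Innatera’s silicon.

```python
# Leaky integrate-and-fire neuron: a common way to model the current-in,
# current-out behavior Kumar describes (all parameters are illustrative).
dt = 1e-4        # s, simulation step
tau = 20e-3      # s, membrane time constant (the "leak")
r_m = 1e7        # ohm, membrane resistance
v_rest, v_thresh, v_reset = 0.0, 0.03, 0.0  # volts

def simulate(input_current, steps=5000):
    """Integrate a constant input current (amps) and return spike times (s)."""
    v = v_rest
    spikes = []
    for step in range(steps):
        # The potential decays toward rest and is charged by the input current.
        v += (-(v - v_rest) + r_m * input_current) / tau * dt
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset, like the neuron discharging
    return spikes

# Scaling the input current changes whether and how often the neuron fires.
for i_in in (2e-9, 4e-9, 8e-9):
    print(f"{i_in:.0e} A -> {len(simulate(i_in))} spikes in 0.5 s")
```

With these assumed values, 2 nA never reaches threshold, while 4 nA and 8 nA fire at increasing rates, which is the current-scaling knob Kumar is describing.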

The idea isn’t to disrupt the current flow of AI to the cloud, but to replace the current crop of AI chips at the edge that aren’t capable of making on-device decisions. The chip also reduces the need to convert analog signals to digital.

“You can’t really translate an analog signal over a long distance because then you actually have degradation. We avoid that by converting this analog signal into a spike,” said Kumar.

Today’s AI is based on simulating the action of the brain’s neurons using digital chips and techniques, an approach that has been very successful. Riding advances in Moore’s Law, these digital circuits and networks have gotten bigger and faster.

But analog has its problems. Calibration issues such as drift, for example, make it difficult to achieve consistency across analog chips.

“Analog circuits and memory cells don’t scale like digital circuits. And most of the time, the analog has to eventually be converted to digital in order to interact with the rest of the system,” Krewell said.

Certainly, the concept of neuromorphic chips is not new. Companies like Intel and IBM have developed brain-inspired chips, and universities are building their own versions using analog circuits. Intel and others have been raising awareness of the difference between neuromorphic chips and traditional AI, but the startups have felt the need to get their products to market as the computational demands and power consumption of AI grow at an unsustainable pace.

Another AI chip company, Rain Neuromorphics, said its chip, which mimics the brain, would be used in particle accelerators at Argonne National Laboratory.

In a presentation at the AI Hardware Summit, the company didn’t give many details about how the chip would be used, but its CEO, Gordon Wilson, said it would act like a silicon brain that helps the research lab study and draw conclusions about particle collisions.

The silicon brain will provide on-device intelligence to protect against sensor drift, which can result in erroneous data being fed to AI systems. The concept is similar to model drift in AI, where bad data fed into a learning model can throw the system off track.
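Neither company describes how drift is detected on the chip, but the underlying concept can be sketched simply: keep a baseline from calibration, watch a running window of readings, and flag the sensor when the window’s statistics wander too far from that baseline. The baseline, tolerance, and simulated drift below are made-up illustrative values, not Rain’s or Argonne’s numbers.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

baseline_mean = 1.00   # value established at calibration time (illustrative)
tolerance = 0.05       # allowed deviation before flagging drift (assumed)
window = deque(maxlen=200)

def check_reading(value):
    """Add a reading to the window; report whether the sensor has drifted."""
    window.append(value)
    if len(window) < window.maxlen:
        return False
    return abs(np.mean(window) - baseline_mean) > tolerance

# Simulate a sensor whose output slowly creeps upward over time.
for step in range(2000):
    reading = 1.0 + 0.0001 * step + rng.normal(0, 0.02)
    if check_reading(reading):
        print(f"drift flagged at step {step}: window mean "
              f"{np.mean(window):.3f} vs baseline {baseline_mean:.2f}")
        break
```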

Wilson claimed that the chip’s on-device capabilities make it more power efficient than running AI in the cloud.

“You need the ability to learn spontaneously. You have to have the ability to train and fine-tune that sensor drift to keep that system performing,” Wilson said.

The first iteration of the Rain chip “essentially won’t look radically different from … other analog or mixed chips,” Wilson said. But it will have the ability to learn, which will unlock more value.

Wilson pointed to different types of memory, such as memristor circuits, that could provide the ability to learn. The memristor concept dates to the early 1970s, and HP (which later became HPE) pursued the technology for a computer project called The Machine, but it still remains a novelty.

“A memristor serves as a memory resistor. It is a resistor that can adjust its resistance. It’s used as an artificial synapse,” Wilson said. In a brain, synapses don’t have to be perfect, and the requirements for Rain’s memristors will be different accordingly.
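Wilson’s description, a resistor that adjusts its own resistance and stands in for a synapse, can be captured in a toy model: conductance plays the role of the synaptic weight, and programming pulses nudge it up or down within device limits. This is a generic memristive-synapse sketch with assumed constants, not Rain’s device model.

```python
class MemristiveSynapse:
    """Toy artificial synapse: conductance acts as the weight and is nudged
    by programming pulses, loosely mimicking a memristor (assumed constants)."""

    def __init__(self, g_min=1e-6, g_max=1e-3, g=1e-4, step=5e-5):
        self.g_min, self.g_max = g_min, g_max  # siemens, device limits
        self.g = g                             # current conductance (the "weight")
        self.step = step                       # conductance change per unit pulse

    def transmit(self, v_in):
        """Output current for an input voltage (Ohm's law, I = G * V)."""
        return self.g * v_in

    def program(self, v_pulse):
        """A positive pulse potentiates the synapse, a negative one depresses
        it, clipped to the device's physical range."""
        self.g = min(self.g_max, max(self.g_min, self.g + self.step * v_pulse))

syn = MemristiveSynapse()
print("current before:", syn.transmit(1.0))
for _ in range(5):
    syn.program(+1.0)   # repeated potentiation, a crude stand-in for learning
print("current after: ", syn.transmit(1.0))
```

Because the weight only has to move in roughly the right direction, imperfect devices can still be useful, which is the point Wilson makes about synapses not needing to be perfect.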

Venture capitalist Sam Altman, known for his work in AI as CEO of OpenAI, invested $25 million in Rain Neuromorphics earlier this year.