Could analog artificial intelligence (AI) hardware, rather than digital, tap fast, low-energy processing to solve machine learning's rising costs and carbon footprint? Researchers say yes: Logan Wright and Tatsuhiro Onodera, research scientists at NTT Research and Cornell University, envision a future where machine learning (ML) will be performed with novel physical hardware, such as devices based on photonics or nanomechanics. These unconventional devices, they say, could be applied in both edge and server settings.

Deep neural networks, which are at the heart of today's AI efforts, hinge on the heavy use of digital processors such as GPUs. But for years, there have been concerns about the monetary and environmental cost of machine learning, which increasingly limits the scalability of deep learning models. A 2019 paper out of the University of Massachusetts, Amherst, for example, performed a life cycle assessment for training several common large AI models. It found that the process can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car, including the manufacturing of the car itself.

According to NTT Research CEO Kazu Gomi, machine learning doesn't have to rely on digital circuits; it can instead run on a physical neural network. This is a type of artificial neural network in which physical analog hardware is used to emulate neurons, as opposed to software-based approaches.
"One of the obvious benefits of using analog systems rather than digital is AI's energy consumption," he said. "The consumption issue is real, so the question is: what are new ways to make machine learning faster and more energy-efficient?"
Analog AI: More like the brain?

From the early history of AI, people weren't trying to think about how to make digital computers, Wright pointed out.
"They were trying to think about how we could emulate the brain, which of course is not digital," he said. "What I have in my head is an analog system, and it's actually much more efficient at performing the types of calculations that go on in deep neural networks than today's digital logic circuits."

The brain is one example of analog hardware for doing AI, but others include systems that use optics. "My favorite example is waves, because a lot of things like optics are based on waves," he said. "In a bathtub, for instance, you could formulate the problem to encode a set of numbers. At the front of the bathtub, you can set up a wave, and the height of the wave gives you this vector X. You let the system evolve for some time, and the wave propagates to the other end of the bathtub. After some time, you can then measure the height of that, and that gives you another set of numbers."

Essentially, nature itself can perform computations. "And you don't need to plug it into anything," he said.
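The bathtub analogy can be made concrete with a toy simulation. The sketch below (an illustration of the general idea, not the researchers' actual system) treats the medium as a fixed linear update applied repeatedly to the input heights, so "letting the wave propagate" amounts to applying a matrix to the encoded vector, and linearity is what lets the physics stand in for the arithmetic:

```python
import numpy as np

# Hypothetical stand-in for the physics: one step of a wave-like update
# that mixes each point with its neighbors (periodic boundary). Any such
# linear update means the medium acts as a fixed matrix M on the input.
def physical_step(h):
    return 0.5 * h + 0.25 * np.roll(h, 1) + 0.25 * np.roll(h, -1)

def evolve(x, steps=20):
    """Encode vector x as wave heights, let the medium evolve, read out."""
    h = x.copy()
    for _ in range(steps):
        h = physical_step(h)
    return h

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=8), rng.normal(size=8)

# Because each step is linear, evolve(a*x1 + b*x2) equals
# a*evolve(x1) + b*evolve(x2): the medium computes y = M @ x.
lhs = evolve(2.0 * x1 + 3.0 * x2)
rhs = 2.0 * evolve(x1) + 3.0 * evolve(x2)
print(np.allclose(lhs, rhs))  # True
```

The point of the sketch is that readout heights are a linear function of input heights, exactly the kind of matrix-vector operation that dominates deep neural network workloads.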
Analog AI hardware approaches

Researchers across the industry are using a variety of approaches to develop analog hardware. IBM Research, for example, has invested in analog electronics, in particular memristor technology, to perform machine learning calculations.
"It's quite promising," the researchers said. "These memristor circuits have the property of having information be naturally computed by nature as the electrons 'flow' through the circuit, allowing them to have potentially much lower energy consumption than digital electronics."

NTT Research, however, is focused on a more general framework that isn't limited to memristor technology. "Our work is focused on also enabling other physical systems, for instance those based on light and mechanics (sound), to perform machine learning," they said. "By doing so, we can make smart sensors in the native physical domain where the information is generated, such as in the case of a smart microphone or a smart camera."

Startups including Mythic also focus on analog AI using electronics, which Wright says is a "great step, and it is probably the lowest risk way to get into analog neural networks." But it is also incremental and has a limited ceiling, he added: "There is only so much improvement in performance that is possible if the hardware is still based on electronics."
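The energy argument for memristor circuits comes down to physics doing the multiply-accumulate. In a crossbar array, weights are stored as device conductances G; input voltages V drive the rows; Ohm's law makes each device pass current G[i, j] * V[i], and Kirchhoff's current law sums those currents on each output column. A minimal numerical sketch of that principle (illustrative values, not from any real device) looks like this:

```python
import numpy as np

# Conceptual memristor crossbar: conductances are the layer's weights,
# row voltages are the inputs, and each column current is one output.
rng = np.random.default_rng(1)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens (weights)
V = rng.uniform(0.0, 0.2, size=4)          # read voltages in volts (inputs)

# Ohm's law per device plus Kirchhoff's current law per column gives
# I[j] = sum_i G[i, j] * V[i] -- a full matrix-vector product that the
# analog circuit produces in a single physical step, no clocked logic.
I = V @ G
print(I)  # column currents in amperes, i.e., the layer's pre-activations
```

A digital processor would spend many switching events per multiply-accumulate; here the same sum of products appears as a settled current, which is the source of the potential energy savings the quote describes.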