Speaker
Boris Murmann
(University of Hawaii)
Description
Over the past decade, machine learning algorithms have been deployed in many cloud-centric applications. However, as the application space continues to grow, various algorithms are now embedded “closer to the sensor,” eliminating the latency, privacy, and energy penalties associated with cloud access. In this talk, I will review circuit techniques that improve the energy efficiency of low-power machine learning inference at the extreme edge. Specific examples include analog feature extraction for image and audio processing, as well as low-energy compute fabrics for convolutional neural networks.