Associate professor of computer science Felix Xiaozhu Lin and his co-author, Hongyu Miao, a former Ph.D. advisee at Purdue University, won the best paper award at the Seventh ACM/IEEE Symposium on Edge Computing (SEC) in Seattle last month.
Lin and Miao's paper, “Towards Out-of-core Neural Networks on Microcontrollers,” addresses the memory-size constraints that keep microcontroller units from running large neural networks. The team demonstrated a technique that enables microcontroller units to run large neural networks with acceptable cost-benefit tradeoffs.
Their findings mean microcontroller units, previously thought unsuitable for many important use cases, can now take on functions such as analyzing traffic patterns from city cameras, understanding speech in smart homes and smart spaces, and detecting anomalies in manufacturing environments.

The SEC symposium, sponsored in part by the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers, brings together top researchers, engineers, students, entrepreneurs and government officials to discuss the opportunities and challenges that arise from rethinking cloud computing architectures and embracing edge computing.
Lin is a participating faculty member in UVA Engineering's Link Lab, a multidisciplinary center for cyber-physical research.
PAPER ABSTRACT
To run neural networks (NNs) on microcontroller units (MCUs), memory size is the major constraint. While algorithm-level techniques exist to reduce NN memory footprints, the resultant losses in NN accuracy and generality disqualify MCUs for many important use cases. To address the constraint, we investigate out-of-core execution of NNs on MCUs: dynamically swapping NN data tiles between an MCU's small SRAM and its large, low-cost external flash. Accordingly, we present a scheduler design that automatically schedules compute tasks and swapping IO tasks in order to minimize the IO overhead in swapping. Out-of-core NNs on MCUs raise multiple concerns: execution slowdown, storage wear-out, energy consumption, and data security. Our empirical study shows that none of these concerns is a showstopper; the key benefit, MCUs being able to run large NNs with full accuracy and generality, trumps the overheads. Our findings suggest that MCUs can play a much greater role in edge intelligence.
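To give a flavor of the idea, the hypothetical C sketch below computes one fully connected layer whose weight matrix lives in "external flash" while streaming it through two small "SRAM" tile buffers (double buffering), so the next tile can be staged while the current one is consumed. All names, sizes and the layer itself are illustrative assumptions, not taken from the paper, and the synchronous memcpy stands in for the asynchronous, DMA-driven flash reads a real scheduler would issue so that swapping IO overlaps with compute.

/*
 * Hypothetical sketch of out-of-core, tiled execution (not the
 * authors' scheduler). "Flash" is a heap array, "SRAM" is two small
 * tile buffers; on real hardware the memcpy would be an async flash
 * read overlapped with the compute loop.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define IN_DIM     256                 /* layer input width                 */
#define OUT_DIM    512                 /* layer output width (weight rows)  */
#define TILE_ROWS   16                 /* weight rows held per SRAM tile    */
#define TILE_BYTES (TILE_ROWS * IN_DIM * sizeof(float))

int main(void) {
    /* "External flash": full weight matrix, too large for SRAM.     */
    float *flash_weights = malloc((size_t)OUT_DIM * IN_DIM * sizeof(float));
    if (!flash_weights) return 1;
    /* "SRAM": two tile buffers so fetch and compute can alternate.  */
    static float sram_tile[2][TILE_ROWS * IN_DIM];
    static float input[IN_DIM], output[OUT_DIM];

    for (int i = 0; i < OUT_DIM * IN_DIM; i++) flash_weights[i] = 0.01f;
    for (int i = 0; i < IN_DIM; i++) input[i] = 1.0f;

    int num_tiles = OUT_DIM / TILE_ROWS;
    int cur = 0;

    /* Prefetch tile 0 before the compute loop starts.               */
    memcpy(sram_tile[cur], flash_weights, TILE_BYTES);

    for (int t = 0; t < num_tiles; t++) {
        int nxt = cur ^ 1;
        /* Swap IO task: stage tile t+1 into the other buffer. A real
         * scheduler issues this asynchronously and waits only when
         * tile t+1 is actually needed.                               */
        if (t + 1 < num_tiles)
            memcpy(sram_tile[nxt],
                   flash_weights + (size_t)(t + 1) * TILE_ROWS * IN_DIM,
                   TILE_BYTES);

        /* Compute task: matrix-vector product for the rows of tile t. */
        for (int r = 0; r < TILE_ROWS; r++) {
            float acc = 0.0f;
            for (int c = 0; c < IN_DIM; c++)
                acc += sram_tile[cur][r * IN_DIM + c] * input[c];
            output[t * TILE_ROWS + r] = acc;
        }
        cur = nxt;                     /* flip buffers for next tile  */
    }

    printf("output[0] = %f\n", output[0]);   /* 256 * 0.01 = 2.56     */
    free(flash_weights);
    return 0;
}

Double buffering is only the simplest instance of the compute/IO co-scheduling the abstract describes; the point of the sketch is that each tile of NN data visits SRAM just long enough to be consumed, which is what lets a memory-constrained MCU execute a model far larger than its on-chip memory.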