Evaluating Complexity and Resilience Trade-offs in Emerging Memory Inference Machines
Abstract
Neuromorphic-style inference performs well only if limited hardware resources are used efficiently, i.e., accuracy continues to scale with parameters and model complexity even in the presence of disturbances. In this work, we use realistic crossbar simulations to show that compact implementations of deep neural networks are unexpectedly susceptible to collapse under multiple system disturbances. We propose a middle path toward high performance and strong resilience using the Mosaics framework, specifically by re-using synaptic connections in a recurrent neural network implementation that possesses a natural form of noise immunity.
BibTeX
@inproceedings{bennett2020evaluating,
author = {Christopher H. Bennett and Ryan Dellana and T. Patrick Xiao and Ben Feinberg and Sapan Agarwal and Suma Cardwell and Matthew J. Marinella and William M. Severa and Brad Aimone},
title = {{Evaluating Complexity and Resilience Trade-offs in Emerging Memory Inference Machines}},
booktitle = {Neuro-Inspired Computational Elements Workshop (NICE)},
year = {2020},
month = {mar},
address = {Heidelberg, Germany},
doi = {10.1145/3381755.3381782}
}