Group Scissor: Scaling Neuromorphic Computing Design to Large Neural Networks

Abstract

The synapse crossbar is an elementary structure in neuromorphic computing systems (NCS). However, the limited size of crossbars and heavy routing congestion impede the NCS implementation of large neural networks. In this paper, we propose a two-step framework (namely, group scissor) to scale NCS designs to large neural networks. The first step, rank clipping, integrates low-rank approximation into training to reduce the total crossbar area. The second step, group connection deletion, structurally prunes connections to reduce routing congestion between crossbars. Tested on the convolutional neural networks LeNet on the MNIST database and ConvNet on the CIFAR-10 database, our experiments show significant reductions of crossbar and routing area in NCS designs. Without accuracy loss, rank clipping reduces the total crossbar area to 13.62% for LeNet and 51.81% for ConvNet. The subsequent group connection deletion further decreases the routing area of LeNet and ConvNet to 8.1% and 52.06%, respectively.
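To make the two steps concrete, below is a minimal numpy sketch of the underlying ideas in isolation: rank clipping approximated post hoc as a truncated SVD of a weight matrix, and group connection deletion as group-wise column pruning by aggregate magnitude. The function names, group definition, and post-training application here are illustrative assumptions; in the paper, rank clipping is integrated into training and deletion targets groups of inter-crossbar connections.

```python
import numpy as np

def rank_clip(W, rank):
    """Illustrative rank clipping: approximate weight matrix W with a
    truncated SVD of the given rank, so it maps onto two smaller
    crossbars (A and B) instead of one large crossbar."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # shape (m, rank)
    B = Vt[:rank, :]             # shape (rank, n), W ~= A @ B
    return A, B

def group_connection_delete(W, group_size, keep_ratio):
    """Illustrative group connection deletion: zero out whole column
    groups with the smallest aggregate magnitude, so the corresponding
    inter-crossbar wires can be removed entirely."""
    m, n = W.shape
    n_groups = n // group_size
    groups = W[:, :n_groups * group_size].reshape(m, n_groups, group_size)
    norms = np.linalg.norm(groups, axis=(0, 2))   # importance of each group
    keep = int(np.ceil(keep_ratio * n_groups))
    threshold = np.sort(norms)[::-1][keep - 1]
    mask = np.repeat(norms >= threshold, group_size)
    W_pruned = W.copy()
    W_pruned[:, :n_groups * group_size] *= mask
    return W_pruned

# Toy usage on a hypothetical 256x128 fully-connected layer.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))
A, B = rank_clip(W, rank=32)
print("relative SVD error:", np.linalg.norm(W - A @ B) / np.linalg.norm(W))
W_sparse = group_connection_delete(W, group_size=16, keep_ratio=0.5)
print("columns kept:", np.count_nonzero(W_sparse.any(axis=0)), "of", W.shape[1])
```

The sketch only shows why the two steps shrink hardware cost: the low-rank factors need fewer crossbar cells than the full matrix, and deleting entire connection groups (rather than scattered weights) removes whole routing wires between crossbars.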

DOI
10.1145/3061639.3062256
Year