Low-power neuromorphic speech recognition engine with coarse-grain sparsity

Publication Type: Conference Paper
Year of Publication: 2017
Authors: S. Yin, D. Kadetotad, B. Yan, C. Song, Y. Chen, C. Chakrabarti, and J.-S. Seo
Conference Name: Proceedings of the Asia and South Pacific Design Automation Conference (ASP-DAC)
Date Published: 02/2017

In recent years, we have seen a surge of interest in neuromorphic computing and its hardware design for cognitive applications. In this work, we present new neuromorphic architecture, circuit, and device co-designs that enable spike-based classification for speech recognition tasks. The proposed neuromorphic speech recognition engine supports a sparsely connected deep spiking network with coarse granularity, leading to a large memory reduction with minimal index information. Simulation results show that the proposed deep spiking neural network accelerator achieves a phoneme error rate (PER) of 20.5% on the TIMIT database and consumes 2.57 mW in 40 nm CMOS for real-time performance. To alleviate the memory bottleneck, the use of non-volatile memory is also evaluated and discussed.
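To make the abstract's memory argument concrete, the sketch below (an illustrative assumption, not the paper's actual implementation; block size, keep fraction, and function names are hypothetical) shows why coarse-grain sparsity needs "minimal index information": pruning weights in contiguous blocks means only one index is stored per surviving block, rather than one index per surviving weight as in fine-grain sparsity.

```python
import numpy as np

def compress_coarse(W, block, keep_frac, rng):
    """Keep a random keep_frac of (block x block) weight blocks.

    Illustrative sketch of coarse-grain sparsity: returns the kept
    blocks plus one index per block (not one per weight).
    """
    rows, cols = W.shape
    nb_r, nb_c = rows // block, cols // block
    n_blocks = nb_r * nb_c
    kept = rng.choice(n_blocks, size=int(keep_frac * n_blocks), replace=False)
    blocks, index = [], []
    for b in sorted(kept):
        r, c = divmod(b, nb_c)
        blocks.append(W[r * block:(r + 1) * block, c * block:(c + 1) * block])
        index.append(b)  # a single index locates an entire block
    return np.stack(blocks), np.array(index)

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))           # toy weight matrix
blocks, index = compress_coarse(W, block=8, keep_frac=0.25, rng=rng)

weights_kept = blocks.size                  # nonzero weights actually stored
fine_grain_index_entries = weights_kept     # fine-grain: one index per weight
coarse_grain_index_entries = index.size     # coarse-grain: one index per block
print(weights_kept, fine_grain_index_entries, coarse_grain_index_entries)
```

Here 75% of the weights are removed, yet the coarse-grain index holds only 16 entries versus 1024 for an equivalent fine-grain scheme, which is the source of the memory reduction the abstract describes.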