EMAT: An Efficient Multi-Task Architecture for Transfer Learning using ReRAM

Title: EMAT: An Efficient Multi-Task Architecture for Transfer Learning using ReRAM
Publication Type: Conference Paper
Year of Publication: 2018
Authors: F. Chen and H. Li
Conference Name: IEEE/ACM International Conference on Computer-Aided Design, Digest of Technical Papers (ICCAD)
Date Published: 11/2018
Abstract

Transfer learning has recently demonstrated great success in general supervised learning by mitigating expensive training efforts. However, existing neural network accelerators have proven inefficient at executing transfer learning because they fail to accommodate the layer-wise heterogeneity in computation and memory requirements. In this work, we propose EMAT, an efficient multi-task architecture for transfer learning built on resistive memory (ReRAM) technology. EMAT exploits the energy efficiency of ReRAM arrays for matrix-vector multiplication and realizes a hierarchical reconfigurable design with heterogeneous computation components to accommodate the data patterns in transfer learning. Compared to a GPU platform, EMAT achieves on average a 120X performance speedup and 87X energy saving. EMAT also obtains a 2.5X speedup over the state-of-the-art CMOS accelerator.
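The abstract's core operation, matrix-vector multiplication on a ReRAM crossbar, can be illustrated with a minimal numerical sketch. This is a hypothetical model, not the paper's implementation: weights are assumed to be programmed as cell conductances G, the input vector is applied as word-line voltages V, and each bit-line sums currents so that I = G·V emerges in a single analog step (Ohm's law plus Kirchhoff's current law).

```python
import numpy as np

# Hypothetical ReRAM crossbar model (illustration only, not the EMAT design).
rng = np.random.default_rng(0)

# Weight matrix stored as cell conductances G (units: siemens).
conductances = rng.uniform(0.1, 1.0, size=(4, 3))

# Input activations encoded as word-line voltages V (units: volts).
voltages = np.array([0.2, 0.5, 0.8])

# Each bit-line accumulates current I = G @ V, i.e. the whole
# matrix-vector product is computed in one analog read operation.
bitline_currents = conductances @ voltages

print(bitline_currents.shape)  # one current per bit-line: (4,)
```

This in-memory formulation is what gives ReRAM arrays their energy advantage over digital accelerators: the multiply-accumulate happens in the memory array itself, avoiding weight movement.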

DOI: 10.1145/3240765.3240805