MeDNN: A distributed mobile system with enhanced partition and deployment for large-scale DNNs

Title: MeDNN: A distributed mobile system with enhanced partition and deployment for large-scale DNNs
Publication Type: Conference Paper
Year of Publication: 2017
Authors: J. Mao, Z. Yang, W. Wen, C. Wu, L. Song, K. W. Nixon, X. Chen, H. Li, and Y. Chen
Conference Name: IEEE/ACM International Conference on Computer-Aided Design, Digest of Technical Papers (ICCAD)
Date Published: 12/2017
Abstract

Deep Neural Networks (DNNs) are pervasively used across a significant number of applications and platforms. To enhance the execution efficiency of large-scale DNNs, previous attempts have focused mainly on client-server paradigms, which rely on powerful external infrastructure, or on model compression, which requires complicated pre-processing. Though effective, these methods overlook the optimization of DNNs on distributed mobile devices. In this work, we design and implement MeDNN, a local distributed mobile computing system with enhanced partitioning and deployment tailored for large-scale DNNs. In MeDNN, we first propose Greedy Two Dimensional Partition (GTDP), which can adaptively partition DNN models onto several mobile devices with respect to individual resource constraints. We also propose Structured Model Compact Deployment (SMCD), a mobile-friendly compression scheme that utilizes a structured sparsity pruning technique to further accelerate DNN execution. Experimental results show that GTDP can accelerate the original DNN execution time by 1.86-2.44x with 2-4 worker nodes. By utilizing SMCD, 26.5% of additional computing time and 14.2% of extra communication time are saved, on average, with negligible effect on model accuracy.
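The two ideas summarized above can be sketched in a few lines. The abstract does not spell out the GTDP or SMCD algorithms, so the function names, signatures, and heuristics below are illustrative assumptions, not the paper's implementation: a capacity-proportional greedy split of a layer's work across heterogeneous workers, and filter-level (structured) pruning that removes whole filters so the remaining tensor stays dense and mobile-friendly.

```python
import numpy as np

def greedy_partition(total_units, capacities):
    """Hypothetical GTDP-style helper: split `total_units` units of
    work (e.g. a layer's output channels) across workers in
    proportion to each worker's compute capacity, handing any
    leftover units to the fastest workers first."""
    total_cap = sum(capacities)
    shares = [total_units * c // total_cap for c in capacities]
    leftover = total_units - sum(shares)
    fastest_first = sorted(range(len(capacities)),
                           key=lambda i: -capacities[i])
    for i in range(leftover):
        shares[fastest_first[i % len(fastest_first)]] += 1
    return shares

def prune_filters(weights, keep_ratio):
    """Hypothetical SMCD-style structured pruning: keep only the
    output filters with the largest L2 norms; removing whole
    filters (rather than individual weights) keeps the remaining
    tensor dense, which suits mobile hardware."""
    norms = np.linalg.norm(weights.reshape(weights.shape[0], -1), axis=1)
    k = max(1, int(round(len(norms) * keep_ratio)))
    keep = np.sort(np.argsort(norms)[-k:])  # preserve filter order
    return weights[keep]
```

For example, `greedy_partition(100, [1, 1, 2])` gives the twice-as-fast third worker half of the work, and `prune_filters(w, 0.5)` halves the number of filters in a 4-D convolution weight tensor `w` while leaving each kept filter intact.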

DOI: 10.1109/ICCAD.2017.8203852