Scalable Smartphone Cluster for Deep Learning

Keywords: Software portability; Deep Neural Networks
DOI: 10.48550/arxiv.2110.12172 Publication Date: 2021-01-01
ABSTRACT
Various deep learning applications on smartphones have been rising rapidly, but training deep neural networks (DNNs) imposes too large a computational burden to be executed on a single smartphone. A portable cluster, which connects smartphones over a wireless network and supports parallel computation across them, is a potential approach to resolve the issue. However, according to our findings, the limitations of wireless communication restrict the cluster size to about 30 smartphones. Such small-scale clusters have insufficient computational power to train DNNs from scratch. In this paper, we propose a scalable smartphone cluster that enables deep learning training by removing the portability constraint to increase its computational efficiency. The cluster connects 138 Galaxy S10+ devices with a wired network using Ethernet. We implemented large-batch synchronous training of DNNs based on Caffe, a deep learning library. The smartphone cluster yielded about 90% of the speed of a P100 when training ResNet-50, and approximately a 43x speed-up over a V100 when training MobileNet-v1.
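To make the training scheme described above concrete, the following is a minimal sketch of large-batch synchronous data-parallel SGD: each worker (a smartphone in the paper's setting) computes gradients on its local shard of the global batch, the gradients are averaged across workers (an all-reduce in a real cluster), and a single synchronized weight update is applied so every replica stays identical. This is a toy NumPy illustration under stated assumptions, not the authors' Caffe-based implementation; the worker_grad() helper and the linear model are hypothetical placeholders.

```python
# Illustrative sketch of large-batch synchronous data-parallel SGD.
# NOT the paper's Caffe implementation; worker_grad() and the toy
# linear-regression model below are hypothetical.
import numpy as np

num_workers = 4          # the paper uses 138 devices; 4 keeps the demo small
local_batch = 32         # per-worker mini-batch size
lr = 0.1

# Toy target: y = X @ w_true + noise
w_true = np.array([2.0, -3.0, 0.5])
w = np.zeros(3)

def worker_grad(w, seed):
    """Gradient of MSE loss on one worker's local mini-batch (hypothetical)."""
    r = np.random.default_rng(seed)
    X = r.normal(size=(local_batch, 3))
    y = X @ w_true + 0.01 * r.normal(size=local_batch)
    err = X @ w - y
    return X.T @ err / local_batch

for step in range(100):
    # Each worker computes its local gradient (in parallel on real hardware).
    grads = [worker_grad(w, seed=step * num_workers + k) for k in range(num_workers)]
    # Synchronous step: average gradients across workers (all-reduce),
    # then apply one update so every replica holds identical weights.
    g = np.mean(grads, axis=0)
    w -= lr * g

print("learned weights:", w)   # approaches w_true
```

The effective batch size here is num_workers * local_batch, which is why such schemes are called large-batch training: adding workers grows the global batch rather than splitting a fixed one.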