FedSkel: Efficient Federated Learning on Heterogeneous Systems with Skeleton Gradients Update

Abstract

Federated learning aims to protect users' privacy while enabling data analysis across multiple participants. However, it is challenging to guarantee training efficiency on heterogeneous systems because of participants' varying computational capabilities and communication bottlenecks. In this work, we propose FedSkel, which enables computation-efficient and communication-efficient federated learning on edge devices by updating only the model's essential parts, named skeleton networks. FedSkel is evaluated on real edge devices with imbalanced datasets. Experimental results show that it achieves up to 5.52x speedups for CONV layers' back-propagation and 1.82x speedups for the whole training process, and reduces communication cost by 64.8%, with negligible accuracy loss.
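The paper is the authoritative reference for how skeleton networks are chosen and trained. As a rough, hypothetical illustration of the core idea, the PyTorch sketch below masks the gradients of non-skeleton CONV filters after back-propagation, selecting the skeleton by gradient magnitude. The function names and the selection heuristic are assumptions for illustration, not FedSkel's actual method; note also that masking after backward only mimics the update rule, whereas FedSkel's back-propagation speedups come from skipping the computation for non-skeleton parts entirely.

```python
# Hypothetical sketch of skeleton-gradient updates. The selection
# criterion (per-filter gradient norm) and all names are illustrative
# assumptions, not taken from the FedSkel paper.
import torch
import torch.nn as nn

def select_skeleton_filters(conv: nn.Conv2d, keep_ratio: float = 0.35) -> torch.Tensor:
    """Return indices of the output filters with the largest gradient norms."""
    grad = conv.weight.grad            # shape: (out_channels, in_channels, kH, kW)
    scores = grad.flatten(1).norm(dim=1)   # one score per output filter
    k = max(1, int(keep_ratio * scores.numel()))
    return scores.topk(k).indices

def mask_to_skeleton(conv: nn.Conv2d, skeleton: torch.Tensor) -> None:
    """Zero the gradients of non-skeleton filters so only the skeleton is updated."""
    mask = torch.zeros(conv.out_channels, dtype=torch.bool, device=conv.weight.device)
    mask[skeleton] = True
    conv.weight.grad[~mask] = 0.0
    if conv.bias is not None and conv.bias.grad is not None:
        conv.bias.grad[~mask] = 0.0

# Usage inside a client's local training step, after loss.backward():
#
#   for module in model.modules():
#       if isinstance(module, nn.Conv2d):
#           mask_to_skeleton(module, select_skeleton_filters(module))
#   optimizer.step()
#
# Only the skeleton filters' gradients would then be sent to the server,
# which is where the communication savings come from.
```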

Publication
In CIKM 2021