
Communication Compression for Decentralized Learning With Operator Splitting Methods


Abstract:

In decentralized learning, operator splitting methods based on a primal-dual formulation, such as Edge-Consensus Learning (ECL), have been shown to be robust to heterogeneous data and have attracted significant attention in recent years. However, in ECL, each node must exchange dual variables with its neighbors, and these exchanges incur significant communication costs. Many compression methods have been proposed for Gossip-based algorithms, but Gossip-based algorithms do not perform well when the data distributions held by the nodes are statistically heterogeneous. In this work, we propose a novel compression framework for ECL, called Communication Compressed ECL (C-ECL). Specifically, we reformulate the update formulas of ECL and propose to compress the update values of the dual variables. We demonstrate experimentally that C-ECL achieves nearly equivalent performance with fewer parameter exchanges than ECL. Moreover, we demonstrate that C-ECL is more robust to heterogeneous data than Gossip-based algorithms.
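As a rough sketch of the idea described above (an illustration under stated assumptions, not the authors' actual update formulas), each node could transmit a compressed difference of its dual variable instead of the full variable. The names top_k_compress and compressed_dual_exchange are hypothetical, and top-k sparsification is just one common choice of compression operator:

import numpy as np

def top_k_compress(v, k):
    # Keep only the k largest-magnitude entries of v, zeroing the rest
    # (a standard sparsifying compression operator).
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def compressed_dual_exchange(y_old, y_new, k):
    # Instead of sending the full dual variable y_new over the edge,
    # send the compressed *update* y_new - y_old; the neighbor applies
    # it to its stored copy y_old.
    delta = top_k_compress(y_new - y_old, k)
    return y_old + delta  # the neighbor's reconstruction of the dual variable

With k much smaller than the dimension of the dual variable, only k values (plus their indices) cross each edge per exchange, which illustrates where the communication savings reported in the abstract come from.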
Page(s): 581 - 595
Date of Publication: 25 August 2023
