Abstract:
In federated learning, each edge device trains the model locally and uploads its local parameters to the server for model aggregation. The major difference between federated learning and distributed learning is that client devices generate and process their data locally without exposing the original data. However, the iterative training increases the communication cost between the server and the clients. In this work, we formulate the participant-selection problem and propose a framework that reduces the communication cost of federated learning by considering internal and external similarities. We also introduce several potential methods for computing these similarities.
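The abstract does not reproduce the paper's selection algorithm. As a loose illustration only, the sketch below shows one plausible form of similarity-based participant selection: rank clients by the cosine similarity between each local update and the latest global update, and pick the k most dissimilar clients so each communication round contributes the most new information. The function names, the use of cosine similarity, and the most-dissimilar-first heuristic are all assumptions, not the paper's method.

```python
import numpy as np

def cosine_similarity(u, v):
    # Standard cosine similarity between two flattened parameter vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def select_participants(global_update, client_updates, k):
    # Hypothetical "external" similarity: compare each client's local update
    # to the global update, then keep the k most dissimilar clients
    # (ascending similarity) for the next aggregation round.
    sims = [cosine_similarity(global_update, u) for u in client_updates]
    order = np.argsort(sims)  # ascending: most dissimilar clients first
    return [int(i) for i in order[:k]]
```

A real system would also weigh an "internal" similarity (e.g., redundancy among the selected clients' updates), which this sketch omits for brevity.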
Date of Conference: 07-08 October 2021
Date Added to IEEE Xplore: 15 November 2021