Federated Learning of Neural Network Models with Heterogeneous Structures


Abstract:

Federated learning trains a model on a centralized server using datasets distributed over a large number of edge devices. Applying federated learning ensures data privacy because it does not transfer local data from edge devices to the server. Existing federated learning algorithms assume that all deployed models share the same structure. However, it is often infeasible to distribute the same model to every edge device because of hardware limitations such as computing performance and storage space. This paper proposes a novel federated learning algorithm to aggregate information from multiple heterogeneous models. The proposed method uses a weighted average ensemble to combine the outputs of the individual models, with the ensemble weights tuned by black-box optimization methods. We evaluated the proposed method using diverse models and datasets and found that it achieves performance comparable to conventional training on centralized datasets. Furthermore, we compared six different optimization methods for tuning the ensemble weights and found that the tree Parzen estimator achieves the highest accuracy among the alternatives.
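The abstract's aggregation step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, data shapes, and the use of plain random search as a stand-in for the paper's black-box optimizers (such as the tree Parzen estimator) are all assumptions made for this example.

```python
import random

def weighted_average_ensemble(outputs, weights):
    """Combine per-model class-probability outputs with a weighted average.

    outputs: list of per-model predictions, each a list of rows of
             class probabilities (n_samples x n_classes)
    weights: one non-negative weight per model (normalized below so the
             result remains a probability distribution)
    """
    total = sum(weights)
    w = [x / total for x in weights]
    n_samples = len(outputs[0])
    n_classes = len(outputs[0][0])
    combined = [[0.0] * n_classes for _ in range(n_samples)]
    for wm, preds in zip(w, outputs):
        for i in range(n_samples):
            for j in range(n_classes):
                combined[i][j] += wm * preds[i][j]
    return combined

def random_search_weights(outputs, labels, n_trials=200, seed=0):
    """Tune ensemble weights by black-box search over accuracy.

    Random search here is only a stand-in for the optimizers compared in
    the paper; it treats accuracy as an opaque objective, as any
    black-box method would.
    """
    rng = random.Random(seed)
    best_w, best_acc = None, -1.0
    for _ in range(n_trials):
        cand = [rng.random() for _ in outputs]
        preds = weighted_average_ensemble(outputs, cand)
        correct = sum(
            max(range(len(row)), key=row.__getitem__) == y
            for row, y in zip(preds, labels)
        )
        acc = correct / len(labels)
        if acc > best_acc:
            best_w, best_acc = cand, acc
    return best_w, best_acc

# Toy example: two heterogeneous "models" predicting over 3 classes
p1 = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
p2 = [[0.5, 0.3, 0.2], [0.2, 0.6, 0.2]]
combined = weighted_average_ensemble([p1, p2], weights=[2.0, 1.0])
weights, acc = random_search_weights([p1, p2], labels=[0, 1])
```

Because the objective (validation accuracy of the ensemble) is evaluated only by running the models, any black-box optimizer can be dropped into the search loop in place of random sampling, which is why the paper can compare six such methods on equal footing.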
Date of Conference: 14-17 December 2020
Date Added to IEEE Xplore: 23 February 2021
Conference Location: Miami, FL, USA
