
SHELOB-FFL: addressing Systems HEterogeneity with LOcally Backpropagated Forward-Forward Learning


Abstract:

Federated Learning (FL) is a method for training Machine Learning (ML) models across many clients while preserving data privacy. To address the challenge of heterogeneous client resources, this paper presents a novel approach that combines the Forward-Forward (FF) algorithm with Backpropagation (BP). The integration forms a blockwise network structure that achieves robust convergence without the chain rule, dividing the model into subnetworks for efficient training. The strategy allows network segments to be allocated dynamically to clients according to their computational resources, so each subnetwork is optimized independently, preventing delays and memory issues. Experiments in IID and non-IID settings across several datasets assess the methodology's viability, focusing on how data and label distributions affect convergence. The study also examines weight-aggregation and regularization techniques such as FedAvg and FedProx, adapting them to understand their effect on this FL approach. Source code is available at: https://github.com/MODALUNINA/SHELOB_FFL
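To make the blockwise idea concrete, the following is a minimal sketch of a Forward-Forward style layer-local update in the spirit the abstract describes: each layer maximizes a "goodness" score (here, mean squared activation) on positive data and minimizes it on negative data, computing gradients only with respect to its own weights, with no chain rule through other layers. The class name, the logistic loss against a threshold theta, and all hyperparameters are illustrative assumptions, not the paper's actual implementation (which is in the linked repository).

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class FFLayer:
    """One locally trained block: no gradients flow in or out of it."""

    def __init__(self, d_in, d_out, theta=2.0, lr=0.05):
        self.W = rng.normal(0.0, 0.1, (d_in, d_out))
        self.theta, self.lr = theta, lr

    def goodness(self, x):
        # Goodness of each sample: mean squared ReLU activation.
        return (relu(x @ self.W) ** 2).mean(axis=1)

    def local_step(self, x_pos, x_neg):
        # Logistic loss pushes goodness above theta for positive data
        # (sign=+1) and below theta for negative data (sign=-1).
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            h = relu(x @ self.W)
            g = (h ** 2).mean(axis=1)
            p = 1.0 / (1.0 + np.exp(sign * (g - self.theta)))
            coef = (-sign * p)[:, None]          # dLoss/dGoodness per sample
            # Chain only within this layer: dGoodness/dW through the ReLU.
            grad = x.T @ (coef * (2.0 / h.shape[1]) * h * (h > 0))
            self.W -= self.lr * grad

# Toy usage: positive data has a larger norm, so its goodness should
# grow relative to the negative data as the layer trains itself.
layer = FFLayer(4, 8)
x_pos = rng.normal(0.0, 1.0, (32, 4)) + 2.0
x_neg = rng.normal(0.0, 1.0, (32, 4))
gap_before = layer.goodness(x_pos).mean() - layer.goodness(x_neg).mean()
for _ in range(100):
    layer.local_step(x_pos, x_neg)
gap_after = layer.goodness(x_pos).mean() - layer.goodness(x_neg).mean()
```

Because each such block trains on its own local objective, a federated system can hand different blocks to different clients and aggregate them independently (e.g. with FedAvg or FedProx), which is the systems-heterogeneity angle the paper exploits.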
Date of Conference: 23 July 2024
Date Added to IEEE Xplore: 04 September 2024
Conference Location: Jersey City, NJ, USA

