Abstract:
Federated learning (FL) has emerged as a pivotal paradigm for distributed model training in edge computing (EC), enabling cooperation among numerous Internet of Things devices while safeguarding their data privacy. Despite its successes in machine learning, concerns regarding data security and model fidelity necessitate efficiently unlearning a target device, i.e., federated unlearning (FUN). However, due to resource constraints, device heterogeneity, and non-independent and identically distributed (Non-IID) data, securely eliminating a device's impact without retraining the model from scratch presents a complex challenge. In response to these challenges, we propose a hierarchical FUN framework, called Hier-FUN. Hier-FUN organizes edge devices into K clusters, each managed by a head device responsible for aggregating local models within the cluster. To expedite both the learning and unlearning processes of Hier-FUN, we design a heuristic algorithm to determine an appropriate value of K based on the devices' data distributions and available resources. In addition, Hier-FUN prohibits communication between the server and cluster heads during training, which confines the influence sphere of the target device and accelerates the unlearning process. We conduct extensive experiments on real-world datasets, and the results illustrate that Hier-FUN improves test accuracy by 3.19% during the learning phase and achieves a 6.8× speedup during unlearning compared with the baseline methods.
Published in: IEEE Internet of Things Journal ( Volume: 12, Issue: 7, 01 April 2025)