
Defending Against Data and Model Backdoor Attacks in Federated Learning


Abstract:

Federated learning (FL) enables collaborative model training without transferring local data, which can greatly improve training efficiency. However, FL is susceptible to data and model backdoor attacks. To address data backdoor attacks, in this article we propose a defense method named TSF. TSF transforms data from the time domain to the frequency domain and applies a low-pass filter to mitigate the impact of the high-frequency signals introduced by backdoor samples. Additionally, we apply homomorphic encryption to local updates to prevent the server from inferring users' data. We also introduce a defense method against model backdoor attacks named ciphertext field similarity detection differential privacy (CFSD-DP). CFSD-DP screens malicious updates using cosine similarity detection in the ciphertext domain and perturbs the global model with a differential privacy mechanism to mitigate the impact of model backdoor attacks. It can effectively detect malicious updates while safeguarding the privacy of the global model. Experimental results show that the proposed TSF and CFSD-DP achieve a 73.8% reduction in backdoor accuracy with only a 3% impact on main task accuracy compared with state-of-the-art schemes. Code is available at https://github.com/whwh456/TSF.
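The sketch below is not the authors' implementation; it only illustrates, in plain NumPy, the two ideas the abstract describes: a TSF-style frequency-domain low-pass filter applied to input data, and a CFSD-DP-style step that screens client updates by cosine similarity and perturbs the aggregate with noise. In the paper the similarity check runs in the ciphertext domain under homomorphic encryption; here it is shown in plaintext. All function names, the cutoff radius, the similarity threshold, and the noise scale are hypothetical choices.

```python
import numpy as np

def lowpass_filter_image(img, cutoff=0.25):
    """TSF-style idea: move an image to the frequency domain and suppress the
    high-frequency components that backdoor triggers tend to introduce."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))   # 2-D FFT, centre the spectrum
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    mask = radius <= cutoff * min(h, w) / 2        # keep only low frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

def screen_and_perturb(updates, reference, threshold=0.5, sigma=0.01):
    """CFSD-DP-style idea (plaintext analogue): drop client updates whose cosine
    similarity to a reference direction is too low, then add Gaussian noise to
    the aggregate as a differential-privacy-style perturbation."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    kept = [u for u in updates if cos(u, reference) >= threshold]
    agg = np.mean(kept, axis=0) if kept else np.zeros_like(reference)
    return agg + np.random.normal(0.0, sigma, size=agg.shape)
```

For example, a flattened local model update that points away from the reference direction (cosine similarity below the threshold) is excluded from aggregation, and the surviving mean is noised before it is applied to the global model.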
Published in: IEEE Internet of Things Journal (Volume 11, Issue 24, 15 December 2024)
Page(s): 39276 - 39294
Date of Publication: 18 June 2024

