Federated Learning under Attack: Game-Theoretic Mitigation of Data Poisoning

De Santis M.; Esposito C.
2025

Abstract

Federated Learning (FL) is vulnerable to various attacks, including data poisoning and model poisoning, which can degrade the quality of the global model. Existing defences rely on cryptographic primitives or Byzantine-robust aggregation, but they worsen performance and/or model quality and can even be ineffective against data poisoning. This work proposes a game-theoretic solution to identify malicious weights and differentiate between benign and compromised updates. We use the Prisoner's Dilemma and Signaling Games to model the interactions between local learners and the aggregator, allowing a precise evaluation of the legitimacy of shared weights. Upon detecting an attack, the system activates a rollback mechanism to restore the model to a safe state. The proposed approach enhances FL robustness by mitigating the impact of attacks while preserving the global model's generalization capabilities.
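To make the mechanism concrete, the minimal Python/NumPy sketch below shows one way such a scheme could work: each client accrues a Prisoner's-Dilemma-style payoff depending on how far its update deviates from a robust consensus, updates from distrusted clients are excluded from aggregation, and the global model is rolled back to the last saved snapshot when every update is flagged. The payoff values, threshold, median-based consensus, and all function names are illustrative assumptions, not the authors' actual signaling-game formulation or rollback policy.

# Hypothetical sketch: game-theoretic screening of client updates with rollback.
# Payoffs, threshold, and the median consensus are illustrative assumptions.
import copy
import numpy as np

COOPERATE_REWARD = 1.0   # payoff when an update stays close to the consensus
DEFECT_PENALTY = -2.0    # payoff when an update deviates beyond tolerance
TRUST_THRESHOLD = 0.0    # clients whose cumulative payoff drops below this are excluded


def payoff(update, consensus, tol=1.0):
    # Prisoner's-Dilemma-style payoff: "cooperating" (small deviation from the
    # consensus update) is rewarded, "defecting" (large deviation) is penalised.
    deviation = np.linalg.norm(update - consensus)
    return COOPERATE_REWARD if deviation <= tol else DEFECT_PENALTY


def aggregate_round(global_weights, client_updates, trust_scores, history):
    # One FL round: score clients, aggregate only trusted updates, and roll back
    # to the last snapshot if every update is flagged (attack detected).
    # Assumes client indices are stable across rounds.
    consensus = np.median(np.stack(client_updates), axis=0)  # robust reference signal
    trusted = []
    for cid, update in enumerate(client_updates):
        trust_scores[cid] = trust_scores.get(cid, 0.0) + payoff(update, consensus)
        if trust_scores[cid] >= TRUST_THRESHOLD:
            trusted.append(update)

    if not trusted:                                  # attack detected: restore safe state
        return copy.deepcopy(history[-1]), trust_scores

    history.append(copy.deepcopy(global_weights))    # snapshot before applying updates
    new_weights = global_weights + np.mean(trusted, axis=0)
    return new_weights, trust_scores


# Toy usage: history is seeded with the initial global weights so that a
# rollback always has a safe state to return to.
w = np.zeros(10)
history = [w.copy()]
trust = {}
updates = [np.random.normal(0.0, 0.1, 10) for _ in range(5)]
w, trust = aggregate_round(w, updates, trust, history)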

Use this identifier to cite or link to this item: https://hdl.handle.net/11386/4920437
