  • 1
    In: Security and Safety, EDP Sciences, Vol. 2 (2023), p. 2023006
    Abstract: Federated learning (FL) has developed rapidly as individuals and industry place growing emphasis on data. Federated learning allows participants to jointly train a global model without sharing their local data, which significantly enhances data privacy. However, federated learning is vulnerable to poisoning attacks by malicious participants: because the server has no access to participants' local training processes, attackers can compromise the global model by uploading carefully crafted malicious local updates under the guise of normal participants. Existing model poisoning attacks typically craft harmful local updates by adding small perturbations to the local model after training, with the attacker tuning the perturbation size to bypass robust detection methods while corrupting the global model as much as possible. In contrast, we propose a novel model poisoning attack based on the momentum of historical information (MPHM): the attacker crafts new malicious updates by dynamically building perturbations from the historical information of local training, which makes the malicious updates more effective and stealthy. Our attack aims to indiscriminately reduce the testing accuracy of the global model while using minimal information. Experiments show that, against classical defenses, our attack degrades the accuracy of the global model significantly more than other advanced poisoning attacks. (A minimal illustrative sketch of the momentum idea follows this record.)
    Type of Medium: Online Resource
    ISSN: 2826-1275
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
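
The abstract describes the general recipe behind momentum-style model poisoning: accumulate a direction from past updates and perturb the benign local update along it, scaled just enough to evade robust aggregation. The sketch below illustrates that idea only; the function names, the momentum coefficient `beta`, and the scaling factor `gamma` are assumptions for illustration and do not reproduce the paper's actual MPHM algorithm.

```python
import numpy as np

def momentum_direction(update_history, beta=0.9):
    """Accumulate a momentum vector over past (flattened) model updates.

    `update_history` and `beta` are hypothetical stand-ins; the exact
    use of historical information is not specified in the abstract.
    """
    momentum = np.zeros_like(update_history[0])
    for update in update_history:
        momentum = beta * momentum + (1.0 - beta) * update
    # Normalize so that `gamma` alone controls the perturbation size.
    return momentum / (np.linalg.norm(momentum) + 1e-12)

def craft_malicious_update(benign_update, update_history, gamma=1.0):
    """Perturb a benign update against the momentum direction.

    A real attacker would tune `gamma` so the update still passes
    robust-aggregation checks while degrading the global model.
    """
    direction = momentum_direction(update_history)
    return benign_update - gamma * direction

# Toy usage: random vectors stand in for flattened model updates.
rng = np.random.default_rng(0)
history = [rng.normal(size=10) for _ in range(5)]
benign = rng.normal(size=10)
malicious = craft_malicious_update(benign, history, gamma=2.0)
print(np.linalg.norm(malicious - benign))  # perturbation magnitude == gamma
```

Because the direction is unit-normalized, the perturbation magnitude equals `gamma` exactly; in practice the attacker would search over `gamma` each round, as the abstract describes.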