Abstract
Federated Unlearning (FU) aims to delete specific training data from an ML model trained using Federated Learning (FL). However, existing FU methods suffer from inefficiencies due to the high costs associated with gradient recomputation and storage. This paper presents QuickDrop, an original and efficient FU approach designed to overcome these limitations. During model training, each client uses QuickDrop to generate a compact synthetic dataset, serving as a compressed representation of the gradient information utilized during training. This synthetic dataset facilitates fast gradient approximation, allowing rapid downstream unlearning at minimal storage cost. To unlearn some knowledge from the trained model, QuickDrop clients execute stochastic gradient ascent with samples from the synthetic datasets instead of the training dataset. The tiny volume of synthetic data significantly reduces computational overhead compared to conventional FU methods. Evaluations with three standard datasets and five baselines show that, with comparable accuracy guarantees, QuickDrop reduces the unlearning duration by 463× compared to retraining the model from scratch and 65–218× compared to FU baselines. QuickDrop supports both class- and client-level unlearning, multiple unlearning requests, and relearning of previously erased data.
| Original language | English |
|---|---|
| Title of host publication | Middleware 2024 - Proceedings of the 25th ACM International Middleware Conference |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 266-278 |
| Number of pages | 13 |
| ISBN (Electronic) | 9798400706233 |
| DOIs | |
| Publication status | Published - 2 Dec 2024 |
| Event | 25th ACM International Middleware Conference, Middleware 2024 - Hong Kong, Hong Kong. Duration: 2 Dec 2024 → 6 Dec 2024 |
Publication series
| Name | Middleware 2024 - Proceedings of the 25th ACM International Middleware Conference |
|---|
Conference
| Conference | 25th ACM International Middleware Conference, Middleware 2024 |
|---|---|
| Country/Territory | Hong Kong |
| City | Hong Kong |
| Period | 2/12/24 → 6/12/24 |
Bibliographical note
Publisher Copyright: © 2024 Copyright held by the owner/author(s).
Keywords
- Dataset Distillation
- Federated Learning
- Federated Unlearning
- Machine Unlearning
- Privacy and Security