FLUID: federated learning with unlearning and instant drift-recovery
Publication Type
Conference Paper
Date Issued
2025-10
Language
English
Author(s)
Start Page
394
End Page
401
Citation
3rd International Conference on Federated Learning Technologies and Applications, FLTA 2025
Contribution to Conference
Publisher DOI
Publisher
IEEE
ISBN of container
979-8-3315-5670-9
979-8-3315-5671-6
Federated Learning (FL) enables collaborative model training without sharing raw data, but struggles with concept drift, where client data distributions evolve over time. Existing approaches either retain outdated knowledge or suffer high computational costs from frequent retraining. This work introduces Federated Learning with Unlearning and Instant Drift-recovery (FLUID), a novel framework that treats concept drift as a selective forgetting problem. By leveraging techniques from Federated Unlearning, FLUID removes obsolete knowledge while adapting to new data patterns. It combines the lightweight unlearning capability of Federated Auxiliary Unlearning (FedAU) with the rapid recovery capability of Rapid Retraining (RRT), without requiring full retraining. Experimental results show that FLUID achieves balanced, all-around performance: it matches RRT in recovery speed with better convergence, matches FedAU's lower accuracy dips during drift, and outperforms both the natural recovery of Federated Averaging (FedAvg) and retraining from scratch.
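The abstract's core idea, treating drift as selective forgetting followed by fast recovery rather than full retraining, can be illustrated with a minimal sketch. The functions below are hypothetical simplifications, not FLUID's actual algorithm: `unlearn` stands in for a FedAU-style lightweight subtraction of a drifted client's contribution, and `recover` for a few RRT-style aggregation rounds on post-drift updates.

```python
import numpy as np

def fedavg(updates, sizes):
    """Standard FedAvg: size-weighted average of client model vectors."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

def unlearn(global_model, contribution, scale=1.0):
    """Illustrative unlearning step (FedAU-style idea, simplified):
    subtract a client's accumulated contribution from the global model."""
    return global_model - scale * contribution

def recover(model, fresh_updates, sizes, rounds=3):
    """Illustrative rapid recovery (RRT-style idea, simplified): a few
    extra aggregation rounds on post-drift updates instead of retraining
    from scratch. The 0.5 mixing factor is an arbitrary assumption."""
    for _ in range(rounds):
        model = fedavg([model * 0.5 + u * 0.5 for u in fresh_updates], sizes)
    return model
```

On drift detection, the server would first call `unlearn` to drop the obsolete contribution, then `recover` to converge toward the new data distribution, which is the forgetting-then-adapting sequence the abstract describes.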
Subjects
Machine Learning
Federated Learning
Federated Unlearning
Concept Drift
DDC Class
600: Technology
006.31: Machine Learning