Title: FLUID: Federated Learning with Unlearning and Instant Drift-Recovery
Authors: Kumar, Aman; Nisal Hemadasa; Kaaser, Dominik; Schulte, Stefan
Published in: 3rd International Conference on Federated Learning Technologies and Applications (FLTA 2025), 2025-10
Record date: 2026-02-09
URI: https://hdl.handle.net/11420/61455
DOI: 10.1109/flta67013.2025.11336437
Type: Conference Paper
Language: English
Keywords: Machine Learning; Federated Learning; Federated Unlearning; Concept Drift
Classification: Technology::600: Technology; Computer Science, Information and General Works::006: Special computer methods::006.3: Artificial Intelligence::006.31: Machine Learning

Abstract: Federated Learning (FL) enables collaborative model training without sharing raw data, but it struggles with concept drift, where client data distributions evolve over time. Existing approaches either retain outdated knowledge or incur high computational costs from frequent retraining. This work introduces Federated Learning with Unlearning and Instant Drift-Recovery (FLUID), a novel framework that treats concept drift as a selective forgetting problem. Leveraging techniques from Federated Unlearning, FLUID removes obsolete knowledge while adapting to new data patterns. It combines the lightweight unlearning capability of Federated Auxiliary Unlearning (FedAU) with the rapid recovery capability of Rapid Retraining (RRT), without requiring full retraining. Experimental results show that FLUID achieves balanced, all-around performance: it matches RRT in recovery speed and convergence, matches FedAU in keeping accuracy dips during drift low, and outperforms both the natural recovery of Federated Averaging (FedAvg) and retraining from scratch.