Adaption via selection: on client selection to counter concept drift in Federated Learning
Publication Type
Conference Paper
Date Issued
2025-02
Language
English
Author(s)
Schallmoser, Dominik
First published in
Lecture Notes in Computer Science
Number in series
15547 LNCS
Start Page
3
End Page
17
Citation
Lecture Notes in Computer Science 15547 LNCS: 3-17 (2025)
Contribution to Conference
Publisher DOI
Scopus ID
Publisher
Springer
ISBN
978-3-031-84617-5
978-3-031-84616-8
978-3-031-84618-2
Federated Learning is a Machine Learning paradigm in which multiple devices jointly train a shared model without sharing their local data. Most Federated Learning approaches assume that the data remains static. However, this assumption is unrealistic in real-world scenarios, where data may change over time. In this paper, we study the impact of concept drift on Federated Learning models with heterogeneous data distributions. Client selection directly impacts the model’s performance, and we observe that the local loss of clients affected by concept drift deviates from that of unaffected clients. Nevertheless, most applications select clients randomly in each training round. Our main contribution is therefore a probabilistic client selection algorithm that biases selection towards clients with higher local loss in order to counter concept drift. Extensive evaluations show that our algorithm recovers a drop in accuracy incurred by concept drift up to five times faster than random sampling while retaining competitive performance.
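The selection rule described in the abstract, sampling clients with probability proportional to their local loss rather than uniformly at random, could be sketched as follows. This is a minimal illustration, not the paper's exact algorithm; the function name `select_clients`, the `bias` exponent, and the small loss floor are all assumptions introduced for the example.

```python
import random

def select_clients(local_losses, num_selected, bias=1.0, rng=None):
    """Sample clients without replacement, biased toward higher local loss.

    local_losses: dict mapping client id -> most recent local loss.
    bias: exponent controlling how strongly selection favors high-loss
          clients (bias=0 reduces to uniform random sampling).
    Note: this is an illustrative sketch, not the paper's algorithm.
    """
    rng = rng or random.Random()
    # Floor the loss so every client keeps a nonzero selection chance.
    pool = [(cid, max(loss, 1e-12) ** bias)
            for cid, loss in local_losses.items()]
    selected = []
    for _ in range(min(num_selected, len(pool))):
        # Draw one client with probability proportional to its weight.
        total = sum(w for _, w in pool)
        r = rng.random() * total
        acc = 0.0
        for i, (cid, w) in enumerate(pool):
            acc += w
            if r <= acc:
                break
        selected.append(pool.pop(i)[0])
    return selected
```

With `bias=1.0`, a client whose loss has deviated upward under concept drift is proportionally more likely to be picked for the next training round, while low-loss clients are still occasionally sampled.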
Subjects
Client Selection | Concept Drift | Federated Learning | Machine Learning
DDC Class
600: Technology