Title: Adaption via selection: on client selection to counter concept drift in Federated Learning
Authors: Thomas, Julius; Saile, Finn; Fischer, Mathias; Schallmoser, Dominik; Schulte, Stefan
Type: Conference Paper
Published in: Lecture Notes in Computer Science 15547 LNCS: 3-17 (2025)
Publication date: 2025-02
Available: 2025-03-31
ISBN: 978-3-031-84617-5; 978-3-031-84616-8; 978-3-031-84618-2
DOI: 10.1007/978-3-031-84617-5_1
Handle: https://hdl.handle.net/11420/55048
Language: en
Keywords: Client Selection; Concept Drift; Federated Learning; Machine Learning
Classification: Technology::600: Technology

Abstract: Federated Learning is a Machine Learning paradigm in which multiple devices jointly train a shared model without the need to share their local data. Most Federated Learning approaches assume that the data remains static. However, this assumption is unrealistic in real-world scenarios where data may change over time. In this paper, we study the impact of concept drift on Federated Learning models with heterogeneous data distributions. Client selection directly impacts the model's performance, and we observe that the local loss of clients under concept drift deviates. Nevertheless, most applications select clients randomly for each training round. Our main contribution therefore is a probabilistic client selection algorithm that introduces a bias towards clients with higher local loss in order to counter concept drift. Extensive evaluations show that our algorithm recovers a drop in accuracy incurred by concept drift up to five times faster than random sampling while retaining competitive performance.
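The core idea in the abstract, sampling clients for a training round with probability biased towards higher local loss rather than uniformly at random, can be sketched as follows. This is an illustrative sketch only: the function name, the simple loss-proportional weighting, and all parameters are assumptions for exposition and do not reproduce the paper's exact algorithm.

```python
import numpy as np

def select_clients(local_losses, num_selected, rng=None):
    """Loss-biased probabilistic client selection (illustrative sketch).

    Clients whose local loss deviates upwards, e.g. under concept
    drift, are drawn with proportionally higher probability, while
    every client keeps a non-zero chance of being selected.
    """
    rng = np.random.default_rng() if rng is None else rng
    losses = np.asarray(local_losses, dtype=float)
    # Weight each client by its share of the total loss.
    probs = losses / losses.sum()
    # Sample without replacement so each round gets distinct clients.
    return rng.choice(len(losses), size=num_selected, replace=False, p=probs)

# Example: client 3 reports a much higher local loss (a drifted client)
# and is therefore the most likely to be picked for the next round.
selected = select_clients([0.2, 0.3, 0.25, 4.0], num_selected=2,
                          rng=np.random.default_rng(42))
```

Under this weighting, a uniform loss vector degenerates to plain random sampling, so the bias only takes effect when some clients' losses deviate, which matches the abstract's motivation.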