Title: Local distribution privacy in federated learning
Authors: Stelldinger, Peer; Ibrahim, Mustafa Fuad Rifet
Type: Conference Paper
Conference: 16th International Symposium on Intelligent Distributed Computing (IDC 2023)
Date issued: 2024-07-01
Date available: 2024-08-09
ISBN: 9783031600227
DOI: 10.1007/978-3-031-60023-4_4
Handle: https://hdl.handle.net/11420/48726
Language: English
Keywords: Federated Learning; Local Distribution Privacy; Mutual Information
Classification: Computer Science, Information and General Works :: 004: Computer Sciences

Abstract: Privacy is a well-researched area in the context of Federated Learning. Typically, ensuring privacy means that individual data used for local training cannot be reconstructed by other local learners or a central server. Thus, it is the individual data points that should be private, but not the entire distribution of locally available data. In many cases, this makes sense because each data point comes from a different individual while all data points originate from a common global distribution. In this position paper, we address a more challenging task where the privacy of each local data distribution must be preserved. This is relevant for use cases where there is a one-to-one mapping from local learners to users, such as when each local learner is part of a personalized assistant on a smartphone. We provide a definition of this problem case, describe the challenges that need to be addressed, and formulate a possible approach to solve the problem.
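The following is a minimal illustrative sketch, not taken from the paper, of the setting the abstract describes: each local learner draws data from its own client-specific distribution, and even aggregate updates sent to the server (here, a per-client mean estimate in a toy FedAvg-style loop) expose properties of that local distribution rather than individual data points. All variable names, the scalar "model", and the numeric choices are assumptions made for illustration only.

```python
# Sketch (assumed setup, not the authors' method): per-client distributions
# differ, so client updates leak distribution-level information.
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 5
ROUNDS = 10

# One-to-one mapping from clients to users: each client samples from a
# client-specific Gaussian, i.e. the local distributions differ.
client_means = rng.uniform(-3.0, 3.0, size=NUM_CLIENTS)
client_data = [rng.normal(loc=m, scale=1.0, size=200) for m in client_means]

global_model = 0.0  # a single scalar "model": the estimated global mean
updates = []

for _ in range(ROUNDS):
    updates = []
    for data in client_data:
        # Local training step: move the model toward the local data mean.
        local_model = global_model + 0.5 * (data.mean() - global_model)
        updates.append(local_model)
    # Server aggregation (FedAvg-style averaging of the client models).
    global_model = float(np.mean(updates))

# The per-client updates seen by the server converge toward each client's
# local mean, so the local distribution is revealed even though no single
# data point was transmitted. Protecting against this distribution-level
# leakage is what "local distribution privacy" refers to, as opposed to
# the per-data-point privacy usually studied in federated learning.
print("client distribution means:", np.round(client_means, 2))
print("last-round client updates:", np.round(updates, 2))
print("global model:", round(global_model, 2))
```

Running the sketch shows the last-round client updates tracking the per-client distribution means, which illustrates why per-data-point guarantees alone do not hide the local distribution in a one-client-per-user setting.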