Title: Securing blockchain-based distributed learning for heterogeneous clients through knowledge distillation and zero-knowledge proofs
Authors: Nkamdjin Njike, Ghislain; Hoang, Anh-Tu; Schulte, Stefan
Published in: IEEE International Conference on Blockchain (Blockchain 2025)
Publication date: 2025-10
Available online: 2025-12-19
Type: Conference Paper
Language: English
DOI: 10.1109/blockchain67634.2025.00029
Handle: https://hdl.handle.net/11420/60373
Keywords: distributed learning; knowledge distillation; zero-knowledge proofs; blockchain
DDC classification: Computer Science, Information and General Works :: 005: Computer Programming, Programs, Data and Security :: 005.7: Data

Abstract:
Distributed learning (DL) is gaining popularity as it enables clients (e.g., AI agents) to enhance their machine learning (ML) models' performance by exchanging knowledge without revealing private datasets. State-of-the-art DL approaches primarily focus on transferring knowledge between heterogeneous clients with diverse model architectures, connecting clients with those that can improve their models, and protecting data privacy. However, they overlook the threat of malicious clients that may degrade model performance by sharing inaccurate knowledge or by excluding high-performing clients from the training procedure. Therefore, we introduce the Zero-Knowledge Blockchain-Based Knowledge Distillation Learning Framework (zkBKD). In zkBKD, heterogeneous clients communicate with a blockchain network to discover high-performing clients, verify zero-knowledge proofs (ZKPs) to ensure the correctness of the knowledge shared by other clients, and vote to eliminate malicious clients. We analyze security and privacy risks and show that zkBKD prevents membership, poisoning, and collusion attacks. We conduct extensive experiments on two standard datasets across heterogeneous clients with four model architectures. The experimental results demonstrate that zkBKD yields a relative improvement of 25.71% in average model accuracy across all clients. Even lightweight models such as ResNet-2 achieve up to a 103.35% accuracy gain compared to independent training.
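The verify-then-distill step described in the abstract (accepting a peer's knowledge only after its zero-knowledge proof checks out, then distilling from the accepted peers) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: verify_proof is a hypothetical placeholder for zkBKD's on-chain ZKP verification, and the sketch assumes peers share soft labels (logits) computed on a common reference batch.

import torch
import torch.nn.functional as F

def verify_proof(proof: bytes, peer_logits: torch.Tensor) -> bool:
    # Hypothetical stand-in for zkBKD's ZKP check that a peer's logits
    # were honestly computed; the real check would be performed against
    # the blockchain network. Placeholder logic: reject empty proofs.
    return len(proof) > 0

def distill_from_peers(student_logits: torch.Tensor,
                       peer_submissions: list[tuple[bytes, torch.Tensor]],
                       temperature: float = 2.0) -> torch.Tensor:
    # Keep only submissions whose proof verifies, mirroring how zkBKD
    # filters out malicious clients before their knowledge is consumed.
    accepted = [logits for proof, logits in peer_submissions
                if verify_proof(proof, logits)]
    if not accepted:
        return torch.tensor(0.0)  # no trusted knowledge this round
    # Average the verified peers' soft labels to form the teacher signal.
    teacher = torch.stack(accepted).mean(dim=0)
    # Standard knowledge-distillation loss: KL divergence between the
    # student's and the averaged teacher's softened distributions.
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

Under this pattern, a submission with an invalid proof is simply dropped, so only verified knowledge influences the student's distillation loss.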