Balancing Efficiency and Personalization in Federated Learning via Blockwise Knowledge Distillation
Abstract
Machine unlearning enables a model to forget specific training data, a capability that is vital for privacy in Federated Learning (FL). This paper presents FedBW, a framework that brings machine unlearning to the challenges of FL. Through Federated Blockwise Distillation, FedBW improves privacy and training efficiency, enabling personalized learning for each client while upholding the ‘right to be forgotten.’ Because clients exchange only logits computed on an unlabeled dataset, FedBW accommodates heterogeneous model architectures without exposing sensitive data. Experiments on CIFAR-10 and CIFAR-100 with VGG16 show improved accuracy and training efficiency, pointing the way for future privacy-preserving machine learning applications.
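As context for the logit-exchange mechanism the abstract describes, the following is a minimal PyTorch sketch of federated distillation over an unlabeled public dataset: clients share only output logits, never raw data or weights, so heterogeneous architectures can participate. All names here (aggregate_logits, distill_step, the toy models) are illustrative assumptions, not the paper's API, and the blockwise scheduling that gives FedBW its name is not detailed in the abstract, so it is omitted.

```python
# Hypothetical sketch of logit-based federated distillation on an
# unlabeled public dataset; names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def aggregate_logits(client_models, public_batch):
    """Average per-client logits on an unlabeled public batch.

    Only logits leave each client, so raw training data stays local and
    clients may use different architectures (model heterogeneity).
    """
    with torch.no_grad():
        logits = torch.stack([m(public_batch) for m in client_models])
    return logits.mean(dim=0)  # consensus soft targets, shape (B, num_classes)

def distill_step(student, optimizer, public_batch, consensus_logits, T=2.0):
    """One knowledge-distillation step: match the student's softened
    outputs on the public batch to the consensus logits (temperature T)."""
    optimizer.zero_grad()
    student_log_probs = F.log_softmax(student(public_batch) / T, dim=1)
    teacher_probs = F.softmax(consensus_logits / T, dim=1)
    # Standard KD loss, rescaled by T^2 to keep gradient magnitudes stable.
    loss = F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * T * T
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: two clients with different architectures, random "unlabeled" data.
clients = [
    torch.nn.Linear(32, 10),
    torch.nn.Sequential(torch.nn.Linear(32, 64), torch.nn.ReLU(),
                        torch.nn.Linear(64, 10)),
]
x_public = torch.randn(16, 32)
targets = aggregate_logits(clients, x_public)
opt = torch.optim.SGD(clients[0].parameters(), lr=0.1)
distill_step(clients[0], opt, x_public, targets)
```

Averaging logits rather than parameters is what permits architectural heterogeneity; in FedBW this exchange would additionally be organized blockwise, per the title, though the abstract does not specify that schedule.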