Ilyas Bayanbayev, Hongjian Shi, and Ruhui Ma, “Balancing Efficiency and Personalization in Federated Learning via Blockwise Knowledge Distillation,” Chinese Journal of Electronics, vol. x, no. x, pp. 1–3, xxxx. DOI: 10.23919/cje.2023.00.424

Balancing Efficiency and Personalization in Federated Learning via Blockwise Knowledge Distillation

  • Machine unlearning enables a system to forget specific training data, a capability vital to privacy in Federated Learning (FL). This paper presents FedBW, a framework that brings machine unlearning into the FL setting. By employing Federated Blockwise Distillation, FedBW improves privacy and training efficiency, enabling personalized learning for each client while upholding the 'right to be forgotten.' Because clients exchange only logits computed on an unlabeled dataset, FedBW accommodates heterogeneous model architectures without exposing sensitive data (an illustrative sketch of this logit exchange follows the abstract). Experimental results on CIFAR-10 and CIFAR-100 with VGG16 show improved accuracy and training efficiency, pointing the way for future privacy-preserving machine learning applications.
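The sketch below is a minimal, hypothetical illustration in Python (PyTorch) of the logit-exchange idea the abstract describes: each client predicts logits on a shared unlabeled batch, a server averages them, and each client then distills from the consensus, so heterogeneous architectures can learn from one another without sharing raw data. All names here (federated_distillation_round, client_models, unlabeled_x, the toy models) are assumptions for illustration, not the paper's API, and the blockwise structure of FedBW itself is not modeled.

import torch
import torch.nn as nn
import torch.nn.functional as F

def federated_distillation_round(client_models, unlabeled_x,
                                 temperature=3.0, lr=1e-3, epochs=1):
    """One communication round: clients share only logits on public
    unlabeled data; each client then distills from the averaged logits."""
    # 1. Each client predicts soft labels on the shared unlabeled batch.
    with torch.no_grad():
        all_logits = torch.stack([m(unlabeled_x) for m in client_models])
    # 2. The server aggregates logits (a simple mean here); raw training
    #    data never leaves the clients.
    consensus = all_logits.mean(dim=0)
    # 3. Each client minimizes KL divergence to the consensus soft labels.
    for model in client_models:
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            student = F.log_softmax(model(unlabeled_x) / temperature, dim=1)
            teacher = F.softmax(consensus / temperature, dim=1)
            loss = F.kl_div(student, teacher,
                            reduction="batchmean") * temperature ** 2
            loss.backward()
            opt.step()
    return consensus

# Toy usage: three clients with different widths (model heterogeneity),
# and random tensors standing in for unlabeled CIFAR-like images.
if __name__ == "__main__":
    clients = [nn.Sequential(nn.Flatten(),
                             nn.Linear(32 * 32 * 3, w), nn.ReLU(),
                             nn.Linear(w, 10))
               for w in (64, 128, 256)]
    public_x = torch.randn(16, 3, 32, 32)
    federated_distillation_round(clients, public_x)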
