Communication-Efficient and Privacy-Preserving Federated Learning via Joint Knowledge Distillation and Differential Privacy in Bandwidth-Constrained Networks

Gad Gad, Eyad Gad, Zubair Md Fadlullah, Mostafa M. Fouda, Nei Kato

Research output: Contribution to journal › Article › Peer-review

18 Citations (Scopus)

Abstract

The development of high-quality deep learning models demands the transfer of user data from edge devices, where it originates, to centralized servers. This centralized training approach has scalability limitations and poses privacy risks to private data. Federated Learning (FL) is a distributed training framework that enables physical smart-system devices to collaboratively learn a task without sharing their private training data with a central server. However, FL introduces new challenges to Beyond 5G (B5G) networks, such as communication overhead, system heterogeneity, and privacy concerns, since the exchange of model updates may still lead to data leakage. This paper examines the communication overhead and privacy risks facing the implementation of FL and presents an algorithm that combines Knowledge Distillation (KD) and Differential Privacy (DP) techniques to address these challenges. We compare the operational flow and network model of model-based and model-agnostic (KD-based) FL algorithms, the latter of which allow per-client model architectures to accommodate heterogeneous and constrained system resources. Our experiments show that KD-based FL algorithms can exceed local accuracy and achieve accuracy comparable to centralized training. Additionally, we show that applying DP to KD-based FL significantly degrades its utility, causing up to 70% accuracy loss for a privacy budget ε ≤ 10.
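To make the interaction between KD-based aggregation and DP noise concrete, the following is a minimal illustrative sketch of one server-side aggregation step: clients share logits on a public transfer set rather than model weights, and the server clips and perturbs the average before broadcasting it back for distillation. All names and parameter values here (num_clients, clip_norm, noise_multiplier, the choice of a Gaussian mechanism) are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative sketch of one KD-based FL round with DP noise on the shared
# predictions. Names and values are assumed, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

num_clients = 5
num_public = 100   # size of the shared public/transfer set (assumed)
num_classes = 10

# Stand-in for each client's local model output: random logits over the
# public set. In practice each client runs its own architecture, which is
# what makes KD-based FL model-agnostic.
client_logits = [rng.normal(size=(num_public, num_classes))
                 for _ in range(num_clients)]

def clip(logits, clip_norm):
    """Clip each example's logit vector to bound per-client sensitivity."""
    norms = np.linalg.norm(logits, axis=1, keepdims=True)
    return logits * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))

clip_norm = 1.0
noise_multiplier = 1.0  # larger -> smaller epsilon, but lower utility

# Server-side aggregation: average the clipped logits, then add Gaussian
# noise calibrated to the per-client sensitivity (clip_norm / num_clients).
avg = np.mean([clip(l, clip_norm) for l in client_logits], axis=0)
sigma = noise_multiplier * clip_norm / num_clients
noisy_avg = avg + rng.normal(scale=sigma, size=avg.shape)

# Each client would then distill from `noisy_avg` (as soft labels) on the
# public set, e.g. by minimizing KL divergence to softmax(noisy_avg / T).
```

In a scheme of this shape, raising noise_multiplier tightens the privacy budget ε at the cost of noisier soft labels, which is consistent with the abstract's observation that applying DP to KD-based FL sharply reduces utility.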

Original language: English
Pages (from-to): 17586-17601
Number of pages: 16
Journal: IEEE Transactions on Vehicular Technology
Volume: 73
Issue number: 11
DOI
Publication status: Published - 2024
