TY - GEN
T1 - Wireless Coded Distributed Learning with Gaussian-based Local Differential Privacy
AU - Xue, Yilei
AU - Lin, Xi
AU - Wu, Jun
AU - Li, Jianhua
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Differentially private distributed machine learning protects privacy by injecting artificial noise into the computing results. To further improve energy efficiency, the natural noise in the wireless environment can be exploited to protect privacy. In this paper, we study the problem of coded distributed machine learning over Gaussian multiple-access wireless channels, achieving differential privacy by exploiting this natural noise. First, we propose an aggregation scheme using differentially private Lagrange encoding in a wireless environment, where the local computing results are uploaded to the master through orthogonal channels. Then, we derive an achievable privacy protection level that illustrates the impact of transmit power and power allocation on privacy. Additionally, we establish a theoretical upper bound on the convergence of the proposed scheme, providing a clear understanding of the system's limitations and capabilities. Finally, we demonstrate a trade-off between system resource settings, convergence, and privacy protection levels through experiments. Specifically, increasing the signal-to-noise ratio (SNR) and the power allocated to gradient computation decreases the privacy protection level of the system and increases training accuracy. Moreover, reducing the number of dataset partitions yields better training accuracy.
AB - Differentially private distributed machine learning protects privacy by injecting artificial noise into the computing results. To further improve energy efficiency, the natural noise in the wireless environment can be exploited to protect privacy. In this paper, we study the problem of coded distributed machine learning over Gaussian multiple-access wireless channels, achieving differential privacy by exploiting this natural noise. First, we propose an aggregation scheme using differentially private Lagrange encoding in a wireless environment, where the local computing results are uploaded to the master through orthogonal channels. Then, we derive an achievable privacy protection level that illustrates the impact of transmit power and power allocation on privacy. Additionally, we establish a theoretical upper bound on the convergence of the proposed scheme, providing a clear understanding of the system's limitations and capabilities. Finally, we demonstrate a trade-off between system resource settings, convergence, and privacy protection levels through experiments. Specifically, increasing the signal-to-noise ratio (SNR) and the power allocated to gradient computation decreases the privacy protection level of the system and increases training accuracy. Moreover, reducing the number of dataset partitions yields better training accuracy.
UR - http://www.scopus.com/inward/record.url?scp=85171421363&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85171421363&partnerID=8YFLogxK
U2 - 10.1109/ISIT54713.2023.10206774
DO - 10.1109/ISIT54713.2023.10206774
M3 - Conference contribution
AN - SCOPUS:85171421363
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 1943
EP - 1948
BT - 2023 IEEE International Symposium on Information Theory, ISIT 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE International Symposium on Information Theory, ISIT 2023
Y2 - 25 June 2023 through 30 June 2023
ER -