TY - GEN
T1 - Toward Privacy-Aware Efficient Federated Graph Attention Network in Smart Cloud
AU - Zhou, Jinhao
AU - Su, Zhou
AU - Wang, Yuntao
AU - Pan, Yanghe
AU - Pan, Qianqian
AU - Liu, Lizheng
AU - Wu, Jun
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Federated graph attention networks (FGATs), blending federated learning (FL) with graph attention networks (GAT), present a novel paradigm for collaborative, privacy-conscious graph model training in the smart cloud. FGATs leverage distributed attention mechanisms to enhance graph feature prioritization, improving representation learning while preserving data decentralization. Despite these advancements, FGATs face privacy concerns, such as attribute inference. Our study proposes an efficient privacy-preserving FGAT (PFGAT). We devise an improved multiplication triplet (IMT)-based attention mechanism with a hybrid differential privacy (DP) approach. We introduce a novel triplet generation method and a hybrid neighbor aggregation algorithm, specifically designed to respect the distinct traits of neighbor nodes, which together efficiently secure GAT node embeddings. Evaluations on benchmarks such as Cora, Citeseer, and Pubmed demonstrate PFGAT's ability to safeguard privacy without compromising efficiency or performance.
AB - Federated graph attention networks (FGATs), blending federated learning (FL) with graph attention networks (GAT), present a novel paradigm for collaborative, privacy-conscious graph model training in the smart cloud. FGATs leverage distributed attention mechanisms to enhance graph feature prioritization, improving representation learning while preserving data decentralization. Despite these advancements, FGATs face privacy concerns, such as attribute inference. Our study proposes an efficient privacy-preserving FGAT (PFGAT). We devise an improved multiplication triplet (IMT)-based attention mechanism with a hybrid differential privacy (DP) approach. We introduce a novel triplet generation method and a hybrid neighbor aggregation algorithm, specifically designed to respect the distinct traits of neighbor nodes, which together efficiently secure GAT node embeddings. Evaluations on benchmarks such as Cora, Citeseer, and Pubmed demonstrate PFGAT's ability to safeguard privacy without compromising efficiency or performance.
KW - Federated learning
KW - attention mechanism
KW - differential privacy
KW - graph neural network
KW - secure computation
KW - smart cloud
UR - http://www.scopus.com/inward/record.url?scp=85198035942&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85198035942&partnerID=8YFLogxK
U2 - 10.1109/SmartCloud62736.2024.00011
DO - 10.1109/SmartCloud62736.2024.00011
M3 - Conference contribution
AN - SCOPUS:85198035942
T3 - Proceedings - 2024 IEEE 9th International Conference on Smart Cloud, SmartCloud 2024
SP - 19
EP - 24
BT - Proceedings - 2024 IEEE 9th International Conference on Smart Cloud, SmartCloud 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th IEEE International Conference on Smart Cloud, SmartCloud 2024
Y2 - 10 May 2024 through 12 May 2024
ER -