GradMFL: Gradient Memory-Based Federated Learning for Hierarchical Knowledge Transferring Over Non-IID Data

Guanghui Tong, Gaolei Li*, Jun Wu, Jianhua Li

*Corresponding author for this work

Research output: Conference contribution

5 Citations (Scopus)

Abstract

Massive datasets are often collected under non-IID distribution scenarios, which leaves existing federated learning (FL) frameworks struggling with model accuracy and convergence. To achieve heterogeneity-aware collaborative training, the FL server aggregates gradients from different clients to ingest and transfer the common knowledge behind non-IID data, but this leads to information loss and bias due to statistical weighting. To address these issues, we propose a Gradient Memory-based Federated Learning (GradMFL) framework, which enables hierarchical knowledge transferring over non-IID data. In GradMFL, a data clustering method is proposed to categorize non-IID data into IID clusters according to similarity. Then, to enable beneficial knowledge transfer between hierarchical clusters, we present a multi-stage model training mechanism that uses gradient memory to constrain the updating directions. Experiments on a set of classification tasks over benchmark datasets show that GradMFL achieves good accuracy and high efficiency.
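
The idea of constraining updating directions with a gradient memory can be illustrated with a minimal sketch. The Python example below is an assumption-laden illustration rather than the paper's exact algorithm: it keeps gradients remembered from earlier training stages and, when a new update direction conflicts with a remembered one, projects out the conflicting component (in the spirit of gradient-episodic-memory-style constraints). The function name `project_gradient` and the toy vectors are hypothetical.

```python
import numpy as np

def project_gradient(grad, memory_grads, eps=1e-12):
    """Constrain an update direction against stored memory gradients.

    For each remembered gradient, if the proposed update has a negative
    inner product with it (i.e. the update would work against previously
    transferred knowledge), the conflicting component is projected out.
    """
    g = grad.copy()
    for g_mem in memory_grads:
        dot = np.dot(g, g_mem)
        if dot < 0:  # update conflicts with remembered direction
            g = g - dot / (np.dot(g_mem, g_mem) + eps) * g_mem
    return g

# Toy usage: one memory gradient, one conflicting new gradient.
memory = [np.array([1.0, 0.0])]
new_grad = np.array([-0.5, 1.0])
print(project_gradient(new_grad, memory))  # -> [0. 1.], conflict removed
```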

Original language: English
Host publication title: Algorithms and Architectures for Parallel Processing - 21st International Conference, ICA3PP 2021, Proceedings
Editors: Yongxuan Lai, Tian Wang, Min Jiang, Guangquan Xu, Wei Liang, Aniello Castiglione
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 612-626
Number of pages: 15
ISBN (Print): 9783030953836
DOI
Publication status: Published - 2022
Externally published: Yes
Event: 21st International Conference on Algorithms and Architectures for Parallel Processing, ICA3PP 2021 - Virtual, Online
Duration: 3 Dec 2021 - 5 Dec 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13155 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 21st International Conference on Algorithms and Architectures for Parallel Processing, ICA3PP 2021
City: Virtual, Online
Period: 3 Dec 2021 - 5 Dec 2021

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
