GradMFL: Gradient Memory-Based Federated Learning for Hierarchical Knowledge Transferring Over Non-IID Data

Guanghui Tong, Gaolei Li*, Jun Wu, Jianhua Li

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Massive datasets are often collected under non-IID distribution scenarios, which causes existing federated learning (FL) frameworks to struggle with model accuracy and convergence. To achieve heterogeneity-aware collaborative training, the FL server aggregates gradients from different clients to ingest and transfer the common knowledge behind non-IID data, but statistical weighting leads to information loss and bias. To address these issues, we propose a Gradient Memory-based Federated Learning (GradMFL) framework, which enables hierarchical knowledge transferring over non-IID data. In GradMFL, a data clustering method is proposed that categorizes non-IID data into IID clusters according to similarity. Then, to enable beneficial knowledge transferring between hierarchical clusters, we present a multi-stage model training mechanism that uses gradient memory to constrain the updating directions. Experiments on a set of classification tasks over benchmark datasets show that GradMFL achieves good accuracy and high efficiency.
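The paper's exact training rule is not given in this record; as an illustrative sketch of the generic gradient-memory idea the abstract describes (constraining update directions so they do not conflict with gradients remembered from other clusters, in the style of gradient-episodic-memory projection), one could write:

```python
import numpy as np

def project_gradient(g, g_mem):
    """If the current gradient g conflicts with a stored memory gradient
    g_mem (negative inner product), project g onto the half-space of
    non-conflicting directions.  Illustrative only: the function name and
    the single-memory projection rule are assumptions, not GradMFL's
    published mechanism."""
    dot = float(np.dot(g, g_mem))
    if dot < 0.0:
        # Remove the component of g that opposes the memory direction.
        g = g - (dot / float(np.dot(g_mem, g_mem))) * g_mem
    return g

# A gradient that opposes the remembered direction gets its conflicting
# component removed, so the update no longer undoes past knowledge.
g = np.array([1.0, -1.0])
g_mem = np.array([0.0, 1.0])
g_proj = project_gradient(g, g_mem)  # -> array([1., 0.])
```

After projection, the inner product with the memory gradient is non-negative, which is the constraint a multi-stage mechanism of this kind would maintain at each stage.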

Original language: English
Title of host publication: Algorithms and Architectures for Parallel Processing - 21st International Conference, ICA3PP 2021, Proceedings
Editors: Yongxuan Lai, Tian Wang, Min Jiang, Guangquan Xu, Wei Liang, Aniello Castiglione
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 612-626
Number of pages: 15
ISBN (Print): 9783030953836
DOIs
Publication status: Published - 2022
Externally published: Yes
Event: 21st International Conference on Algorithms and Architectures for Parallel Processing, ICA3PP 2021 - Virtual, Online
Duration: 2021 Dec 3 – 2021 Dec 5

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13155 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 21st International Conference on Algorithms and Architectures for Parallel Processing, ICA3PP 2021
City: Virtual, Online
Period: 21/12/3 – 21/12/5

Keywords

  • Federated learning
  • Gradient memory
  • Hierarchical clustering
  • Knowledge transferring
  • Non-IID data

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
