A Simple and Effective Usage of Self-supervised Contrastive Learning for Text Clustering

Haoxiang Shi, Cen Wang, Tetsuya Sakai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Contrastive learning is a promising approach to unsupervised learning, as it inherits the advantages of well-studied deep models without requiring a dedicated and complex model design. In this paper, building on bidirectional encoder representations from transformers, we propose self-supervised contrastive learning (SCL) as well as few-shot contrastive learning (FCL) with unsupervised data augmentation (UDA) for text clustering. SCL outperforms state-of-the-art unsupervised clustering approaches for both short and long texts in terms of several clustering evaluation measures. FCL achieves performance close to that of supervised learning, and FCL with UDA further improves performance for short texts.
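The abstract above describes contrastive learning over text representations. As a minimal illustration of the underlying idea (not the authors' exact method), the sketch below computes an NT-Xent-style contrastive loss over two "views" of the same batch of sentence embeddings; the embeddings here are random stand-ins for BERT sentence vectors, and all names and hyperparameters (e.g. `temperature=0.5`) are illustrative assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (InfoNCE) contrastive loss over two views of a batch.

    z1, z2: (n, d) arrays of embeddings -- stand-ins for BERT sentence
    vectors of two augmented views of the same texts. Row i of z1 and
    row i of z2 form the positive pair; everything else is a negative.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)               # (2n, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalise
    sim = (z @ z.T) / temperature                      # cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    # The positive partner of index i is i+n (and of i+n is i).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 32))
# Two nearly identical views give a lower loss than two unrelated ones.
aligned = nt_xent_loss(z1, z1 + 0.01 * rng.normal(size=(8, 32)))
unrelated = nt_xent_loss(z1, rng.normal(size=(8, 32)))
```

In a clustering setting such as the one the paper targets, an encoder trained with a loss of this kind would produce embeddings that are subsequently clustered (e.g. with k-means); this sketch only shows the loss computation itself.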

Original language: English
Title of host publication: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 315-320
Number of pages: 6
ISBN (Electronic): 9781665442077
DOIs
Publication status: Published - 2021
Event: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021 - Melbourne, Australia
Duration: 2021 Oct 17 - 2021 Oct 20

Publication series

Name: Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
ISSN (Print): 1062-922X

Conference

Conference: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021
Country/Territory: Australia
City: Melbourne
Period: 21/10/17 - 21/10/20

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Human-Computer Interaction
