Utilizing BERT Pretrained Models with Various Fine-Tune Methods for Subjectivity Detection

Hairong Huo*, Mizuho Iwaihara

*Corresponding author for this work

Research output: Conference contribution

6 Citations (Scopus)

Abstract

As an essential antecedent task of sentiment analysis, subjectivity detection refers to classifying sentences as either subjective ones containing opinions, or objective and neutral ones without bias. In situations where impartial language is required, such as Wikipedia, subjectivity detection can play an important role. Recently, pretrained language models have proven effective at learning representations, substantially boosting performance on several NLP tasks. As a state-of-the-art pretrained model, BERT is trained on large unlabeled data with masked word prediction and next sentence prediction tasks. In this paper, we explore utilizing BERT pretrained models with several combinations of fine-tuning methods, with the intention of enhancing performance on the subjectivity detection task. Our experimental results reveal that optimal combinations of fine-tuning and multi-task learning surpass the state of the art on subjectivity detection and related tasks.

Original language: English
Host publication title: Web and Big Data - 4th International Joint Conference, APWeb-WAIM 2020, Proceedings
Editors: Xin Wang, Rui Zhang, Young-Koo Lee, Le Sun, Yang-Sae Moon
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 270-284
Number of pages: 15
ISBN (Print): 9783030602895
DOI
Publication status: Published - 2020
Event: 4th Asia-Pacific Web and Web-Age Information Management, Joint Conference on Web and Big Data, APWeb-WAIM 2020 - Tianjin, China
Duration: 18 Sep 2020 - 20 Sep 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12318 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 4th Asia-Pacific Web and Web-Age Information Management, Joint Conference on Web and Big Data, APWeb-WAIM 2020
Country/Territory: China
City: Tianjin
Period: 20/9/18 - 20/9/20

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

