TY - GEN
T1 - Utilizing BERT Pretrained Models with Various Fine-Tune Methods for Subjectivity Detection
AU - Huo, Hairong
AU - Iwaihara, Mizuho
N1 - Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
N2 - As an essential antecedent task of sentiment analysis, subjectivity detection refers to classifying sentences as subjective ones containing opinions, or objective and neutral ones without bias. In situations where impartial language is required, such as Wikipedia, subjectivity detection can play an important part. Recently, pretrained language models have proven effective at learning representations, substantially boosting performance across several NLP tasks. As a state-of-the-art pretrained model, BERT is trained on large unlabeled data with masked word prediction and next sentence prediction tasks. In this paper, we explore utilizing BERT pretrained models with several combinations of fine-tuning methods, with the aim of enhancing performance on the subjectivity detection task. Our experimental results reveal that optimal combinations of fine-tuning and multi-task learning surpass the state of the art on subjectivity detection and related tasks.
AB - As an essential antecedent task of sentiment analysis, subjectivity detection refers to classifying sentences as subjective ones containing opinions, or objective and neutral ones without bias. In situations where impartial language is required, such as Wikipedia, subjectivity detection can play an important part. Recently, pretrained language models have proven effective at learning representations, substantially boosting performance across several NLP tasks. As a state-of-the-art pretrained model, BERT is trained on large unlabeled data with masked word prediction and next sentence prediction tasks. In this paper, we explore utilizing BERT pretrained models with several combinations of fine-tuning methods, with the aim of enhancing performance on the subjectivity detection task. Our experimental results reveal that optimal combinations of fine-tuning and multi-task learning surpass the state of the art on subjectivity detection and related tasks.
KW - BERT
KW - Fine-tuning
KW - Multi-task learning
KW - Pretrained language model
KW - Subjectivity detection
UR - http://www.scopus.com/inward/record.url?scp=85093854660&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85093854660&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-60290-1_21
DO - 10.1007/978-3-030-60290-1_21
M3 - Conference contribution
AN - SCOPUS:85093854660
SN - 9783030602895
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 270
EP - 284
BT - Web and Big Data - 4th International Joint Conference, APWeb-WAIM 2020, Proceedings
A2 - Wang, Xin
A2 - Zhang, Rui
A2 - Lee, Young-Koo
A2 - Sun, Le
A2 - Moon, Yang-Sae
PB - Springer Science and Business Media Deutschland GmbH
T2 - 4th Asia-Pacific Web and Web-Age Information Management, Joint Conference on Web and Big Data, APWeb-WAIM 2020
Y2 - 18 September 2020 through 20 September 2020
ER -