Masked prompt learning for formal analogies beyond words

Liyan Wang*, Yves Lepage

*Corresponding author for this work

Research output: Conference article › peer-review

1 Citation (Scopus)

Abstract

Prompt learning, a recent thread in few-shot learning for pre-trained language models (PLMs), has been explored for completing word analogies in an extractive way. In this paper, we reformulate the analogy task as a masked analogy completion task and use prompting to derive a generative model for analogies beyond words. We introduce a simple prompt-based fine-tuning paradigm that performs language modeling on answered prompts of analogies in the sequence-to-sequence framework. To convert the discrete terms of analogies into linear sequences, we present a symbolic prompt template. The sequence-to-sequence model is fine-tuned to fill in the missing span of masked prompts derived from different masking schemes applied to phrase analogies extracted from a small corpus. We analyze out-of-distribution performance on sentence analogies, which are unseen cases. Our experiments demonstrate that prompt-based fine-tuning with a language modeling objective enables models to achieve significantly better performance on in-distribution cases than PLMs. Masked prompt learning with one-term masking exhibits the best out-of-distribution generalization on sentence analogies, with a difference of only 3 characters from the references.
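To make the setup concrete, here is a minimal sketch of how masked analogy prompts could be built and used to fine-tune a sequence-to-sequence model. The colon-based template "A : B :: C : D", the T5 backbone, and the sentinel-token masking are illustrative assumptions for this sketch, not necessarily the exact formulation used in the paper.

```python
# Minimal sketch: one-term masking of an analogy prompt for a
# sequence-to-sequence PLM (T5-style sentinel masking assumed).
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Hypothetical symbolic prompt template linearizing the four analogy terms.
TEMPLATE = "{a} : {b} :: {c} : {d}"

def masked_prompt(a, b, c, d, mask_index=3):
    """Linearize an analogy and mask one term (one-term masking scheme)."""
    terms = [a, b, c, d]
    target = terms[mask_index]
    terms[mask_index] = "<extra_id_0>"  # T5 sentinel marks the missing span
    source = TEMPLATE.format(a=terms[0], b=terms[1], c=terms[2], d=terms[3])
    # The target sequence restores the masked span, T5-style.
    return source, f"<extra_id_0> {target} <extra_id_1>"

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

src, tgt = masked_prompt("strong", "stronger", "weak", "weaker")
inputs = tokenizer(src, return_tensors="pt")
labels = tokenizer(tgt, return_tensors="pt").input_ids
# Language-modeling loss on the answered prompt, used for fine-tuning.
loss = model(**inputs, labels=labels).loss

# At inference, the model generates the missing term of the masked prompt.
pred = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(pred[0], skip_special_tokens=True))
```

Under this reading, varying `mask_index` (or masking several terms at once) yields the different masking schemes the abstract mentions, while the template itself stays fixed.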

Original language: English
Pages (from-to): 1-14
Number of pages: 14
Journal: CEUR Workshop Proceedings
Volume: 3174
Publication status: Published - 2022
Event: 1st Workshop on the Interactions between Analogical Reasoning and Machine Learning at 31st International Joint Conference on Artificial Intelligence - 25th European Conference on Artificial Intelligence, IARML@IJCAI-ECAI 2022 - Vienna, Austria
Duration: 23 Jul 2022 → …

ASJC Scopus subject areas

  • Computer Science (all)

