TY - GEN
T1 - Improving Sequential Recommendation via Subsequence Extraction
AU - Deng, Hangyu
AU - Hu, Jinglu
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - The temporal order of user behaviors, which implies the user's preferences in the near future, plays a key role in sequential recommendation systems. To capture such patterns from user behavior sequences, many recent works borrow ideas from language models and treat the task as a next-item prediction problem. This is reasonable, but it ignores the gap between user behavior data and text data. Generally speaking, user behaviors are more arbitrary than sentences in natural language. A behavior sequence usually carries multiple intentions, and the exact order matters little, whereas a sentence tends to express a single meaning, and different word orders may convey very different meanings. To address these issues, this study considers a user behavior sequence as a mixture of multiple subsequences. Specifically, we introduce a subsequence extraction module, which assigns the items in a sequence to different subsequences according to their relationships. These subsequences are then fed into the downstream sequence model, from which we obtain several user representations. To train the whole system in an end-to-end manner, we design a new training strategy in which only the user representation near the target item is supervised. To verify the effectiveness of our method, we conduct extensive experiments on four public datasets. Our method is compared with several baselines and achieves better results in most cases. Further experiments explore the properties of our model, and we also visualize the results of the subsequence extraction.
KW - recommendation systems
KW - sequence model
KW - sequential recommendation
UR - http://www.scopus.com/inward/record.url?scp=85140748707&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85140748707&partnerID=8YFLogxK
U2 - 10.1109/IJCNN55064.2022.9892221
DO - 10.1109/IJCNN55064.2022.9892221
M3 - Conference contribution
AN - SCOPUS:85140748707
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Joint Conference on Neural Networks, IJCNN 2022
Y2 - 18 July 2022 through 23 July 2022
ER -