TY - JOUR
T1 - SSE4Rec
T2 - Sequential recommendation with subsequence extraction
AU - Deng, Hangyu
AU - Hu, Jinglu
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/2/15
Y1 - 2024/2/15
N2 - Sequential recommendation mines the sequential patterns in user behavior data to recommend items to users. Recent studies have mainly followed the language modeling paradigm, on the premise that the next item depends on the sequence of previous items. Notably, differences exist between user behavior and textual data. One key difference is that behavioral sequences can encompass multiple intentions, unlike sentences, which typically express a single intention. Furthermore, behavioral sequences emerge freely from users, whereas sentences conform to grammatical rules. This study highlights the risk of treating a behavior sequence as a unified sequence and the resultant potential for overfitting the observed transitions. We mitigated this risk by using subsequence extraction for recommendation (SSE4Rec). This model employs a subsequence extraction module that disperses items into distinct subsequences, grouping related items together. Each subsequence is then processed by an independent downstream sequence model, which discourages the memorization of inconsequential transitions. Both the training and inference strategies are inherently integrated into the model. The proposed method was evaluated on four public datasets, where it outperformed publicly available alternatives or delivered competitive results. The properties of the model were also explored by visualizing the output of the subsequence extraction module.
AB - Sequential recommendation mines the sequential patterns in user behavior data to recommend items to users. Recent studies have mainly followed the language modeling paradigm, on the premise that the next item depends on the sequence of previous items. Notably, differences exist between user behavior and textual data. One key difference is that behavioral sequences can encompass multiple intentions, unlike sentences, which typically express a single intention. Furthermore, behavioral sequences emerge freely from users, whereas sentences conform to grammatical rules. This study highlights the risk of treating a behavior sequence as a unified sequence and the resultant potential for overfitting the observed transitions. We mitigated this risk by using subsequence extraction for recommendation (SSE4Rec). This model employs a subsequence extraction module that disperses items into distinct subsequences, grouping related items together. Each subsequence is then processed by an independent downstream sequence model, which discourages the memorization of inconsequential transitions. Both the training and inference strategies are inherently integrated into the model. The proposed method was evaluated on four public datasets, where it outperformed publicly available alternatives or delivered competitive results. The properties of the model were also explored by visualizing the output of the subsequence extraction module.
KW - Implicit feedback
KW - Recommendation systems
KW - Sequential recommendation
UR - http://www.scopus.com/inward/record.url?scp=85182878791&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85182878791&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.111364
DO - 10.1016/j.knosys.2023.111364
M3 - Article
AN - SCOPUS:85182878791
SN - 0950-7051
VL - 285
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 111364
ER -