Poster
Multimodal Quantitative Language for Generative Recommendation
Jianyang Zhai · Zi-Feng Mai · Chang-Dong Wang · Feidiao Yang · Xiawu Zheng · Hui Li · Yonghong Tian
Hall 3 + Hall 2B #192
Generative recommendation has emerged as a promising paradigm that aims to directly generate the identifiers of the target candidates. Most existing methods attempt to leverage prior knowledge embedded in Pre-trained Language Models (PLMs) to improve recommendation performance. However, they often fail to accommodate the differences between the general linguistic knowledge of PLMs and the specific needs of recommender systems. Moreover, they rarely consider the complementary knowledge across the multimodal information of items, which reflects the multi-faceted preferences of users. To facilitate efficient transfer of recommendation knowledge, we propose a novel approach called Multimodal Quantitative Language for Generative Recommendation (MQL4GRec). Our key idea is to transform items from different domains and modalities into a unified language, which can serve as a bridge for transferring recommendation knowledge. Specifically, we first introduce quantitative translators to convert the text and image content of items from various domains into a new and concise language, termed quantitative language, with all items sharing the same vocabulary. Then, we design a series of quantitative language generation tasks to enrich the quantitative language with semantic information and prior knowledge. Finally, we transfer recommendation knowledge from different domains and modalities to the recommendation task through pre-training and fine-tuning. We evaluate the effectiveness of MQL4GRec through extensive experiments and comparisons with existing methods, achieving improvements over the baseline of 11.18%, 14.82%, and 7.95% on the NDCG metric across three different datasets, respectively. Our implementation is available at: https://anonymous.4open.science/r/QL4GRec-ED65/
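The abstract does not spell out how a quantitative translator is implemented, but one common way to map a continuous item embedding (from text or image encoders) to a short sequence of shared discrete tokens is residual quantization. The sketch below is illustrative only, assuming a residual quantizer with per-level codebooks; all names, dimensions, and codebook sizes are hypothetical and not taken from the paper.

```python
import numpy as np

def residual_quantize(embedding, codebooks):
    """Map a continuous item embedding to a sequence of discrete code
    indices ("quantitative language" tokens) via residual quantization.
    This is a hypothetical sketch, not the paper's actual translator."""
    residual = embedding.astype(np.float64)
    codes = []
    for codebook in codebooks:  # one codebook per quantization level
        # pick the codeword nearest to the current residual
        dists = np.linalg.norm(codebook - residual, axis=1)
        idx = int(np.argmin(dists))
        codes.append(idx)
        # subtract the chosen codeword; the next level quantizes what remains
        residual = residual - codebook[idx]
    return codes

# Toy setup: 16-dim embeddings, 3 levels, 64 codewords per level.
rng = np.random.default_rng(0)
dim, levels, words = 16, 3, 64
codebooks = [rng.normal(size=(words, dim)) for _ in range(levels)]

text_emb = rng.normal(size=dim)  # stand-in for a PLM text embedding
tokens = residual_quantize(text_emb, codebooks)
# tokens is a list of 3 indices, e.g. usable as an identifier "<a_i><b_j><c_k>"
print(tokens)
```

Because every item, regardless of domain or modality, is encoded with the same codebooks, the resulting token sequences share one vocabulary, which is what lets a single generative model consume them as a unified language.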