Leveraging Auto-Distillation and Generative Self-Supervised Learning in Residual Graph Transformers for Recommender Systems
Conference: Paper in the proceedings of an international conference
This paper introduces a novel method for enhancing recommender systems by integrating generative self-supervised learning (SSL) with a Residual Graph Transformer. Our approach emphasizes high-quality data augmentation through pertinent pretext tasks, automated via rationale-aware SSL that distills clear patterns of how users and items interact. The Residual Graph Transformer incorporates a topology-aware transformer to capture global context and employs residual connections to improve graph representation learning. Additionally, an auto-distillation process refines the self-supervised signals to uncover consistent collaborative rationales. Experimental evaluations on multiple datasets demonstrate that our approach consistently outperforms baseline methods.
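To make the residual graph-transformer idea concrete, the sketch below shows a single layer that combines topology-biased global self-attention with residual connections. This is only an illustrative approximation under assumed choices (PyTorch, a dense adjacency bias, single-head attention, made-up class and parameter names such as ResidualGraphTransformerLayer), not the authors' implementation or the paper's auto-distillation pipeline.

```python
# Illustrative sketch only: assumptions include PyTorch, a dense adjacency matrix
# used as an attention bias, and single-head attention. Names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualGraphTransformerLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Topology-aware global attention: attention logits over all nodes are
        # shifted by the adjacency bias so that connected user/item nodes
        # attend to each other more strongly.
        q, k, v = self.q(x), self.k(x), self.v(x)
        logits = (q @ k.t()) / x.size(-1) ** 0.5 + adj
        attn = F.softmax(logits, dim=-1)
        x = self.norm1(x + attn @ v)       # residual connection around attention
        x = self.norm2(x + self.ffn(x))    # residual connection around the FFN
        return x

# Toy usage: 4 user/item nodes with 8-dimensional embeddings and a placeholder graph.
x = torch.randn(4, 8)
adj = torch.eye(4)
layer = ResidualGraphTransformerLayer(8)
print(layer(x, adj).shape)  # torch.Size([4, 8])
```

In such a design, the residual connections keep lower-order interaction signals intact while the attention term injects global graph context; the paper's generative SSL and auto-distillation components would operate on top of representations like these.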