Augmented Negative Sampling for Collaborative Filtering

To balance efficiency and effectiveness, the vast majority of existing methods follow a two-pass approach: the first pass samples a fixed number of unobserved items from a simple static distribution, and the second pass selects the final negative items using a more sophisticated negative sampling strategy.
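As an illustration of the general two-pass pattern described above (a generic sketch, not the paper's specific ANS augmentation step), the first pass below draws candidates uniformly from the unobserved items and the second pass keeps the candidate the current model scores highest:

```python
import numpy as np

def two_pass_negative_sampling(user_pos, num_items, scores, m=32, rng=None):
    """Generic two-pass negative sampling sketch.

    user_pos:  set of item ids the user has interacted with (observed items)
    num_items: total number of items in the catalog
    scores:    current model scores for this user over all items, shape (num_items,)
    m:         number of candidates drawn in the first pass
    """
    rng = rng or np.random.default_rng()
    # Pass 1: sample m candidates from a simple static (uniform) distribution
    # over unobserved items.
    candidates = []
    while len(candidates) < m:
        item = int(rng.integers(num_items))
        if item not in user_pos:
            candidates.append(item)
    # Pass 2: a more sophisticated selection; here, pick the candidate the
    # model currently scores highest (a "hard" negative).
    candidates = np.array(candidates)
    return int(candidates[np.argmax(scores[candidates])])
```

Real methods differ mainly in what pass 2 does with the candidates; the uniform pre-sampling keeps the expensive scoring restricted to `m` items instead of the whole catalog.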

GitHub Link

The GitHub link is https://github.com/asa9aotk/ans-recbole

Introduction

The GitHub repository "Asa9aoTK/ANS-RecBole" contains an implementation of ANS built on the RecBole framework. The ANS method itself lives in the directory "recbole/model/general_recommender". The paper uses four distinct base models; LightGCN is provided out of the box, and BPR-MF, SGL, and NGCF can be obtained through straightforward adjustments to the code.

Content

We implemented ANS using the RecBole framework; please find our method under recbole/model/general_recommender. We used four different base models in this paper, and we provide LightGCN as the base-model version for your convenience. The code for BPR-MF, SGL, and NGCF is available to the reader with simple modifications.
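Since the implementation follows RecBole's conventions, a run would typically be launched through RecBole's standard entry point, e.g. `python run_recbole.py --model=ANS --dataset=ml-100k` (the model name `ANS` and the dataset choice here are assumptions, not confirmed by the repository text). A minimal RecBole-style config fragment might look like:

```yaml
# Hypothetical RecBole-style config sketch.
# Field names follow RecBole conventions; the values and the model name
# are assumptions -- check the repository's own yaml files for the
# actual ANS hyperparameters.
model: ANS
dataset: ml-100k
embedding_size: 64
train_batch_size: 2048
learning_rate: 0.001
```

Consult the repository's configuration files for the hyperparameters actually used with each base model.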

Alternatives & Similar Tools

LongLLaMA: handles very long text contexts, up to 256,000 tokens

LongLLaMA is a large language model designed to handle very long text contexts, up to 256,000 tokens. It's based on OpenLLaMA and uses a technique called Focused Transformer (FoT) for training. The repository provides a smaller 3B version of LongLLaMA for free use. It can also be used as a replacement for LLaMA models with shorter contexts.