A configurable, tunable, and reproducible library for CTR prediction


Keywords
ctr, prediction, recommender, systems, cvr, pytorch, ctr-prediction, recommender-systems
License
Apache-2.0
Install
pip install fuxictr==2.3.0

Documentation


Click-through rate (CTR) prediction is a critical task in industrial applications such as online advertising, recommender systems, and sponsored search. FuxiCTR is an open-source library for CTR prediction, with an emphasis on configurability, tunability, and reproducibility. We hope this project promotes reproducible research and benefits both researchers and practitioners in the field.

Key Features

  • Configurable: Both data preprocessing and models are modularized and configurable.

  • Tunable: Models can be automatically tuned through easy configurations.

  • Reproducible: All the benchmarks can be easily reproduced.

  • Extensible: It can be easily extended with new models, and supports both PyTorch and TensorFlow frameworks.

Model Zoo

No Publication Model Paper Benchmark Version
📂 Feature Interaction Models
1 WWW'07 LR Predicting Clicks: Estimating the Click-Through Rate for New Ads 🚩Microsoft ↗️ torch
2 ICDM'10 FM Factorization Machines ↗️ torch
3 CIKM'13 DSSM Learning Deep Structured Semantic Models for Web Search using Clickthrough Data 🚩Microsoft ↗️ torch
4 CIKM'15 CCPM A Convolutional Click Prediction Model ↗️ torch
5 RecSys'16 FFM Field-aware Factorization Machines for CTR Prediction 🚩Criteo ↗️ torch
6 RecSys'16 DNN Deep Neural Networks for YouTube Recommendations 🚩Google ↗️ torch, tf
7 DLRS'16 Wide&Deep Wide & Deep Learning for Recommender Systems 🚩Google ↗️ torch, tf
8 ICDM'16 PNN Product-based Neural Networks for User Response Prediction ↗️ torch
9 KDD'16 DeepCrossing Deep Crossing: Web-Scale Modeling without Manually Crafted Combinatorial Features 🚩Microsoft ↗️ torch
10 NIPS'16 HOFM Higher-Order Factorization Machines ↗️ torch
11 IJCAI'17 DeepFM DeepFM: A Factorization-Machine based Neural Network for CTR Prediction 🚩Huawei ↗️ torch, tf
12 SIGIR'17 NFM Neural Factorization Machines for Sparse Predictive Analytics ↗️ torch
13 IJCAI'17 AFM Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks ↗️ torch
14 ADKDD'17 DCN Deep & Cross Network for Ad Click Predictions 🚩Google ↗️ torch, tf
15 WWW'18 FwFM Field-weighted Factorization Machines for Click-Through Rate Prediction in Display Advertising 🚩Oath, TouchPal, LinkedIn, Alibaba ↗️ torch
16 KDD'18 xDeepFM xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems 🚩Microsoft ↗️ torch
17 CIKM'19 FiGNN FiGNN: Modeling Feature Interactions via Graph Neural Networks for CTR Prediction ↗️ torch
18 CIKM'19 AutoInt/AutoInt+ AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks ↗️ torch
19 RecSys'19 FiBiNET FiBiNET: Combining Feature Importance and Bilinear feature Interaction for Click-Through Rate Prediction 🚩Sina Weibo ↗️ torch
20 WWW'19 FGCNN Feature Generation by Convolutional Neural Network for Click-Through Rate Prediction 🚩Huawei ↗️ torch
21 AAAI'19 HFM/HFM+ Holographic Factorization Machines for Recommendation ↗️ torch
22 Arxiv'19 DLRM Deep Learning Recommendation Model for Personalization and Recommendation Systems 🚩Facebook ↗️ torch
23 NeuralNetworks'20 ONN Operation-aware Neural Networks for User Response Prediction ↗️ torch, tf
24 AAAI'20 AFN/AFN+ Adaptive Factorization Network: Learning Adaptive-Order Feature Interactions ↗️ torch
25 AAAI'20 LorentzFM Learning Feature Interactions with Lorentzian Factorization 🚩eBay ↗️ torch
26 WSDM'20 InterHAt Interpretable Click-through Rate Prediction through Hierarchical Attention 🚩NEC Labs, Google ↗️ torch
27 DLP-KDD'20 FLEN FLEN: Leveraging Field for Scalable CTR Prediction 🚩Tencent ↗️ torch
28 CIKM'20 DeepIM Deep Interaction Machine: A Simple but Effective Model for High-order Feature Interactions 🚩Alibaba, RealAI ↗️ torch
29 WWW'21 FmFM FM^2: Field-matrixed Factorization Machines for Recommender Systems 🚩Yahoo ↗️ torch
30 WWW'21 DCN-V2 DCN V2: Improved Deep & Cross Network and Practical Lessons for Web-scale Learning to Rank Systems 🚩Google ↗️ torch
31 CIKM'21 DESTINE Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction 🚩Alibaba ↗️ torch
32 CIKM'21 EDCN Enhancing Explicit and Implicit Feature Interactions via Information Sharing for Parallel Deep CTR Models 🚩Huawei ↗️ torch
33 DLP-KDD'21 MaskNet MaskNet: Introducing Feature-Wise Multiplication to CTR Ranking Models by Instance-Guided Mask 🚩Sina Weibo ↗️ torch
34 SIGIR'21 SAM Looking at CTR Prediction Again: Is Attention All You Need? 🚩BOSS Zhipin ↗️ torch
35 KDD'21 AOANet Architecture and Operation Adaptive Network for Online Recommendations 🚩Didi Chuxing ↗️ torch
36 AAAI'23 FinalMLP FinalMLP: An Enhanced Two-Stream MLP Model for CTR Prediction 🚩Huawei ↗️ torch
37 SIGIR'23 FinalNet FINAL: Factorized Interaction Layer for CTR Prediction 🚩Huawei ↗️ torch
38 SIGIR'23 EulerNet EulerNet: Adaptive Feature Interaction Learning via Euler's Formula for CTR Prediction 🚩Huawei ↗️ torch
39 CIKM'23 GDCN Towards Deeper, Lighter and Interpretable Cross Network for CTR Prediction 🚩Microsoft torch
40 ICML'24 WuKong Wukong: Towards a Scaling Law for Large-Scale Recommendation 🚩Meta torch
41 Arxiv'24 DCNv3 DCNv3: Towards Next Generation Deep Cross Network for Click-Through Rate Prediction ↗️ torch
📂 Behavior Sequence Modeling
42 KDD'18 DIN Deep Interest Network for Click-Through Rate Prediction 🚩Alibaba ↗️ torch
43 AAAI'19 DIEN Deep Interest Evolution Network for Click-Through Rate Prediction 🚩Alibaba ↗️ torch
44 DLP-KDD'19 BST Behavior Sequence Transformer for E-commerce Recommendation in Alibaba 🚩Alibaba ↗️ torch
45 CIKM'20 DMIN Deep Multi-Interest Network for Click-through Rate Prediction 🚩Alibaba ↗️ torch
46 AAAI'20 DMR Deep Match to Rank Model for Personalized Click-Through Rate Prediction 🚩Alibaba ↗️ torch
47 DLP-KDD'22 ETA Efficient Long Sequential User Data Modeling for Click-Through Rate Prediction 🚩Alibaba torch
48 CIKM'22 SDIM Sampling Is All You Need on Modeling Long-Term User Behaviors for CTR Prediction 🚩Meituan torch
49 KDD'23 TransAct TransAct: Transformer-based Realtime User Action Model for Recommendation at Pinterest 🚩Pinterest ↗️ torch
📂 Dynamic Weight Network
50 NeurIPS'22 APG APG: Adaptive Parameter Generation Network for Click-Through Rate Prediction 🚩Alibaba ↗️ torch
51 KDD'23 PPNet PEPNet: Parameter and Embedding Personalized Network for Infusing with Personalized Prior Information 🚩KuaiShou ↗️ torch
📂 Multi-Task Modeling
52 Arxiv'17 ShareBottom An Overview of Multi-Task Learning in Deep Neural Networks torch
53 KDD'18 MMoE Modeling Task Relationships in Multi-task Learning with Multi-Gate Mixture-of-Experts 🚩Google torch
54 KDD'18 PLE Progressive Layered Extraction (PLE): A Novel Multi-Task Learning (MTL) Model for Personalized Recommendations 🚩Tencent torch
📂 Multi-Domain Modeling
55 KDD'23 PEPNet PEPNet: Parameter and Embedding Personalized Network for Infusing with Personalized Prior Information 🚩KuaiShou torch

Benchmarking

We have benchmarked FuxiCTR models on a set of open datasets. The benchmark datasets, running steps, and results are available in the BARS project referenced in the Quick Start section below.

Dependencies

FuxiCTR has the following dependencies:

  • Python 3.9+
  • PyTorch 1.10+ (required only for torch models)
  • TensorFlow 2.1+ (required only for tf models)

Please install other required packages via pip install -r requirements.txt.
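
As a quick sanity check of the environment, the following snippet (illustrative only) prints the interpreter and framework versions so they can be compared against the minimums listed above:

    # Illustrative environment check: compare the printed versions
    # against the minimum requirements listed above.
    import sys
    print("python:", sys.version.split()[0])   # needs 3.9+

    import torch                               # only needed for torch models
    print("pytorch:", torch.__version__)       # needs 1.10+

    # import tensorflow as tf                  # only needed for tf models
    # print("tensorflow:", tf.__version__)     # needs 2.1+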

Quick Start

  1. Run the demo examples

    Examples are provided in the demo directory to show the basic usage of FuxiCTR. Users can run them for a quick start and to understand the overall workflow.

    cd demo
    python example1_build_dataset_to_parquet.py
    python example2_DeepFM_with_parquet_input.py
    
  2. Run a model on tiny data

    Users can easily run each model in the model zoo following the commands below, which demonstrate a run of DCN. Users can also modify the dataset config and model config files to run on their own datasets or with new hyper-parameters; an illustrative sketch of the two config files is shown after the commands. More details can be found in the README.

    cd model_zoo/DCN/DCN_torch
    python run_expid.py --expid DCN_test --gpu 0
    
    # Change `MODEL` according to the target model name
    cd model_zoo/MODEL_PATH
    python run_expid.py --expid MODEL_test --gpu 0
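
    For reference, each experiment is described by a dataset config (where the data lives and how each column is handled) and a model config (which model and hyper-parameters to use for a given dataset_id). The sketch below mirrors that structure as Python dicts; the actual files are YAML, and the field and column names here are illustrative assumptions modeled on the demo configs, so please check the config files shipped with each model for the exact schema.

    # Illustrative only: the real configs are YAML files; field names and
    # column names below are assumptions modeled on the demo configs.
    dataset_config = {
        "tiny_parquet": {
            "data_root": "../data/",
            "data_format": "parquet",
            "train_data": "../data/tiny_parquet/train.parquet",
            "valid_data": "../data/tiny_parquet/valid.parquet",
            "test_data": "../data/tiny_parquet/test.parquet",
            "feature_cols": [{"name": ["user_id", "item_id"], "active": True,
                              "dtype": "str", "type": "categorical"}],
            "label_col": {"name": "click", "dtype": "float"},
        }
    }
    model_config = {
        "DCN_test": {
            "model": "DCN",
            "dataset_id": "tiny_parquet",
            "loss": "binary_crossentropy",
            "metrics": ["logloss", "AUC"],
            "optimizer": "adam",
            "learning_rate": 1e-3,
            "embedding_dim": 10,
            "batch_size": 128,
            "epochs": 10,
        }
    }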
    
  3. Run a model on benchmark datasets (e.g., Criteo)

    Users can follow the benchmark section to obtain the benchmark datasets and the running steps for reproducing the existing results. Please see an example here: https://github.com/reczoo/BARS/tree/main/ranking/ctr/DCNv2/DCNv2_criteo_x1

  4. Implement a new model

    The FuxiCTR library is designed to be modular, so that every component can be overridden by users according to their needs. In many cases, only the model class needs to be implemented for a new customized model. If the data preprocessing or data loader is not directly applicable, they can also be overridden through the core APIs. We show a concrete example that implements our new model FinalMLP, published in AAAI 2023. A minimal sketch of a custom model class is also given below.
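
    For illustration, the sketch below outlines what a minimal custom model might look like: an embedding layer followed by an MLP tower. The base class, layer, and helper names (BaseModel, FeatureEmbedding, MLP_Block, get_inputs, output_activation) follow the conventions of the bundled PyTorch models in FuxiCTR 2.x, but treat them as assumptions and copy an existing model under model_zoo (e.g., DNN or FinalMLP) as the authoritative template.

    # A hypothetical minimal model; names of base classes and layers are
    # assumptions based on FuxiCTR 2.x conventions, not an official template.
    from fuxictr.pytorch.models import BaseModel
    from fuxictr.pytorch.layers import FeatureEmbedding, MLP_Block

    class MyDNN(BaseModel):
        def __init__(self, feature_map, model_id="MyDNN", gpu=-1,
                     learning_rate=1e-3, embedding_dim=10,
                     hidden_units=[64, 64], **kwargs):
            super(MyDNN, self).__init__(feature_map, model_id=model_id,
                                        gpu=gpu, **kwargs)
            # One embedding vector per feature field
            self.embedding_layer = FeatureEmbedding(feature_map, embedding_dim)
            # Prediction tower over the flattened field embeddings
            self.mlp = MLP_Block(input_dim=feature_map.num_fields * embedding_dim,
                                 output_dim=1,
                                 hidden_units=hidden_units,
                                 output_activation=self.output_activation)
            self.compile(kwargs["optimizer"], kwargs["loss"], learning_rate)
            self.reset_parameters()
            self.model_to_device()

        def forward(self, inputs):
            X = self.get_inputs(inputs)             # dict of feature tensors
            feature_emb = self.embedding_layer(X)   # [batch_size, num_fields, embedding_dim]
            y_pred = self.mlp(feature_emb.flatten(start_dim=1))
            return {"y_pred": y_pred}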

  5. Tune hyper-parameters of a model

    FuxiCTR currently supports fast grid search of model hyper-parameters using multiple GPUs. The following example runs a grid search of 8 experiments on 4 GPUs; how the grid expands into experiments is illustrated after the command.

    cd experiment
    python run_param_tuner.py --config config/DCN_tiny_parquet_tuner_config.yaml --gpu 0 1 2 3 0 1 2 3
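
    The tuner config pairs a base experiment config with a grid of candidate hyper-parameter values; the tuner expands the grid into individual experiments and dispatches them to the listed GPUs. The snippet below (the hyper-parameter names are only an example, and the real tuner config is a YAML file) illustrates how 8 experiments can arise from such a grid:

    # Illustrative only: grid search enumerates the Cartesian product of the
    # candidate value lists, giving 4 x 2 = 8 experiments, which the command
    # above spreads over 4 GPUs (each GPU id is listed twice).
    import itertools

    tuner_space = {
        "net_dropout": [0, 0.1, 0.2, 0.3],   # 4 candidate values
        "learning_rate": [1e-3, 5e-4],       # 2 candidate values
    }
    grid = list(itertools.product(*tuner_space.values()))
    print(len(grid))  # 8 experiments in total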
    

🔥 Citation

If you find our code or benchmarks helpful in your research, please cite the following papers.