xgbtune

Automated XGBoost tuning


Keywords
gradient-boosting, hyperparameter-optimization, hyperparameter-tuning, machine-learning, parameter-tuning, tuning-parameters, xgboost
License
MIT
Install
pip install xgbtune==1.1.0

Documentation

XGBTune


XGBTune is a library for automated XGBoost model tuning. Tuning an XGBoost model is as simple as a single function call.

Get Started

from xgbtune import tune_xgb_model

# start from a base parameter set; tune_xgb_model returns the
# tuned parameters and the best boosting round count
params = {'eval_metric': 'rmse'}
params, round_count = tune_xgb_model(params, x_train, y_train)

Install

XGBTune is available on PyPI and can be installed with pip:

pip install xgbtune

Tuning steps

The tuning is done in the following steps:

  • compute best round
  • tune max_depth and min_child_weight
  • tune gamma
  • re-compute best round
  • tune subsample and colsample_bytree
  • fine tune subsample and colsample_bytree
  • tune alpha and lambda
  • tune seed

These steps can be repeated several times. By default, two passes are done.
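The stepwise procedure above can be sketched as a sequence of small grid searches, each tuning one or two parameters while the rest stay fixed, repeated over several passes. This is a minimal illustration of the idea, not XGBTune's actual internals: the `evaluate` function below is a hypothetical stand-in for cross-validated XGBoost training.

```python
from itertools import product

def evaluate(params):
    # Hypothetical validation loss standing in for a real
    # cross-validated XGBoost run with the given parameters.
    return ((params['max_depth'] - 5) ** 2
            + (params['min_child_weight'] - 3) ** 2
            + abs(params['gamma'] - 0.2))

def tune_group(params, names, grids):
    # Grid-search a small group of parameters jointly,
    # keeping all other parameters fixed.
    best, best_loss = dict(params), evaluate(params)
    for values in product(*grids):
        trial = dict(params)
        trial.update(zip(names, values))
        loss = evaluate(trial)
        if loss < best_loss:
            best, best_loss = trial, loss
    return best

params = {'max_depth': 6, 'min_child_weight': 1, 'gamma': 0.0}
for _ in range(2):  # two passes, mirroring the default
    params = tune_group(params, ('max_depth', 'min_child_weight'),
                        (range(3, 10), range(1, 6)))
    params = tune_group(params, ('gamma',), ([0.0, 0.1, 0.2, 0.3],))

print(params)
```

Tuning parameters in small groups like this keeps the search cheap compared to a full joint grid, at the cost of possibly missing interactions between groups; the extra passes are what let earlier groups adjust to later ones.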