Welcome to the developer documentation for SigOpt. If you have a question you can’t answer, feel free to contact us!

Introduction to SigOpt XGBoost Integration

XGBoost is a machine learning framework for gradient-boosted decision trees, first released by Tianqi Chen in 2014. It is a popular choice in the machine learning community for a wide variety of use cases, such as data science competitions, benchmarking, and production machine learning systems.

SigOpt has worked with many customers who leverage SigOpt's experiment management and optimization platform to track and optimize their XGBoost models. Based on customer feedback and our research and development, we integrated XGBoost's learning API into the SigOpt client to offer XGBoost users a seamless path to:

  • Automatically track hyperparameters, metadata, checkpoints, and metrics in a SigOpt Run with minimal code.
  • Automatically tune hyperparameters in a SigOpt Experiment with model-aware optimization techniques.


You can install the sigopt.xgboost client library with the following command. The integration requires xgboost>=1.3.0 in the same Python virtual environment.

$ pip install 'sigopt[xgboost]'

Note that this command will install xgboost and numpy as part of its dependencies.

If this is your first time installing the client, you need to follow the setup instructions to configure your SigOpt API Token.

$ sigopt config

Now you are ready to use the sigopt.xgboost library, which gives you the following capabilities.

  1. Automated tracking of XGBoost training with sigopt.xgboost.run.
  2. Automated tuning of XGBoost models with sigopt.xgboost.experiment.
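As a minimal sketch of the tracking workflow: sigopt.xgboost.run accepts the same parameter dictionary and DMatrix inputs as XGBoost's learning API (xgboost.train). The snippet below assumes the integration is installed and an API token has been configured via `sigopt config`; the `ctx.model` attribute on the returned object is our assumption about the integration's return value, not something stated in this page.

```python
# Hedged sketch, assuming sigopt[xgboost] is installed and `sigopt config`
# has been run. Hyperparameters are passed in XGBoost learning-API style;
# the integration tracks them (plus metrics and metadata) in a SigOpt Run.
params = {
    "objective": "binary:logistic",  # binary classification objective
    "max_depth": 4,                  # tree depth, tracked automatically
    "eta": 0.1,                      # learning rate, tracked automatically
}

# With training data loaded into an xgboost.DMatrix, training plus
# tracking is a single call mirroring xgboost.train:
#
#   import xgboost as xgb
#   import sigopt.xgboost
#   dtrain = xgb.DMatrix(X_train, label=y_train)
#   ctx = sigopt.xgboost.run(params, dtrain)
#   model = ctx.model  # assumed: the trained xgboost.Booster
```

Because the call signature mirrors xgboost.train, an existing training script can typically adopt tracking by swapping the training call rather than restructuring the code.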