
Bayesian Optimization of XGBoost Hyperparameters with Ray Tune

Ray Tune is a scalable hyperparameter optimization library that supports various search algorithms, including Bayesian Optimization.

By leveraging Ray Tune, you can efficiently search for the best hyperparameters for your XGBoost models, often improving performance while using far less compute than manual tuning or exhaustive methods like grid search.

In this example, we’ll demonstrate how to use Ray Tune with the Bayesian Optimization search algorithm to tune XGBoost hyperparameters for a synthetic classification dataset.

First, make sure you have Ray Tune installed, along with the bayesian-optimization package that backs Ray Tune's BayesOptSearch:

pip install 'ray[tune]' bayesian-optimization

Now, let’s see how to use Ray Tune for XGBoost hyperparameter optimization:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier
from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

# Generate a synthetic classification dataset
X, y = make_classification(n_samples=1000, n_features=10, n_informative=5, n_redundant=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Define the objective function
def objective(config):
    model = XGBClassifier(
        # BayesOpt samples floats, so cast the integer-valued parameters
        n_estimators=int(config["n_estimators"]),
        max_depth=int(config["max_depth"]),
        learning_rate=config["learning_rate"],
        subsample=config["subsample"],
        colsample_bytree=config["colsample_bytree"],
        random_state=42,
    )
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    accuracy = accuracy_score(y_test, y_pred)
    return {"accuracy": accuracy}

# Define the search space
# Note: BayesOptSearch only supports continuous uniform float parameters,
# so integer hyperparameters are expressed as floats (and cast to int in
# the objective), and log-scale samplers are avoided.
config = {
    "n_estimators": tune.uniform(50, 500),
    "max_depth": tune.uniform(2, 10),
    "learning_rate": tune.uniform(1e-3, 1e-1),
    "subsample": tune.uniform(0.5, 1.0),
    "colsample_bytree": tune.uniform(0.5, 1.0),
}

# Create a Ray Tune experiment
bayesopt = BayesOptSearch(metric="accuracy", mode="max")
analysis = tune.run(
    objective,
    config=config,
    search_alg=bayesopt,
    num_samples=50,
    resources_per_trial={"cpu": 2},
)

# Get the best hyperparameters and corresponding accuracy
best_config = analysis.get_best_config(metric="accuracy", mode="max")
best_accuracy = analysis.get_best_trial(metric="accuracy", mode="max").last_result["accuracy"]

print(f"Best hyperparameters: {best_config}")
print(f"Best accuracy: {best_accuracy:.4f}")

Bayesian Optimization is a sequential model-based optimization approach that constructs a probabilistic model of the objective function and uses this model to select the next set of hyperparameters to evaluate. By leveraging the information from previous evaluations, Bayesian Optimization can often find better hyperparameters in fewer iterations compared to random or grid search.
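To make the idea concrete, here is a minimal, self-contained sketch of a Bayesian Optimization loop on a one-dimensional toy function, using scikit-learn's GaussianProcessRegressor as the surrogate model and expected improvement as the acquisition function. This illustrates the general technique only; it is not how Ray Tune implements BayesOptSearch internally, and the toy function, candidate grid, and iteration count are arbitrary choices:

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Toy objective to maximize (stand-in for an expensive evaluation)
def f(x):
    return -(x - 2.0) ** 2 + 3.0

rng = np.random.default_rng(42)
X_obs = rng.uniform(0, 4, size=(3, 1))  # a few initial random evaluations
y_obs = f(X_obs).ravel()

candidates = np.linspace(0, 4, 200).reshape(-1, 1)
for _ in range(10):
    # Fit a probabilistic surrogate model to all evaluations so far
    gp = GaussianProcessRegressor(normalize_y=True, alpha=1e-6).fit(X_obs, y_obs)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Expected improvement over the best value observed so far
    improve = mu - y_obs.max()
    z = improve / np.maximum(sigma, 1e-9)
    ei = improve * norm.cdf(z) + sigma * norm.pdf(z)
    # Evaluate the true objective at the most promising candidate
    x_next = candidates[np.argmax(ei)].reshape(1, 1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, f(x_next).item())

print(f"Best x found: {X_obs[np.argmax(y_obs)][0]:.3f}")

Each iteration refits the surrogate on everything observed so far and evaluates the point with the highest expected improvement, trading off exploration (high uncertainty) against exploitation (high predicted value).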

Ray Tune simplifies the process of hyperparameter tuning by providing a unified interface for defining the search space, objective function, and search algorithm. It supports distributed execution, allowing you to parallelize the search process across multiple CPUs or GPUs, making it suitable for large-scale hyperparameter tuning tasks.
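Because Bayesian Optimization is inherently sequential, it often pays to cap how many trials run at once so the surrogate model can learn from completed results before proposing new points. One way to do this, sketched below with an arbitrary cap of 4 concurrent trials, is to wrap the searcher in Ray Tune's ConcurrencyLimiter (objective and config are the same as in the example above):

from ray import tune
from ray.tune.search import ConcurrencyLimiter
from ray.tune.search.bayesopt import BayesOptSearch

# Allow at most 4 trials to run at the same time
search_alg = ConcurrencyLimiter(
    BayesOptSearch(metric="accuracy", mode="max"),
    max_concurrent=4,
)

analysis = tune.run(
    objective,
    config=config,
    search_alg=search_alg,
    num_samples=50,
    resources_per_trial={"cpu": 2},
)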

In this example, we defined an objective function that takes a hyperparameter configuration as input, trains an XGBoost classifier, and returns the model's accuracy on the test set. We then specified the search space using Ray Tune's tune.uniform sampling function. Because BayesOptSearch only supports continuous search spaces, integer-valued hyperparameters like n_estimators and max_depth are sampled as floats and cast to int inside the objective.
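A single train/test split can make the reported accuracy noisy, and noisy evaluations mislead the surrogate model. If your evaluation budget allows, a common variation is to score each configuration with cross-validation inside the objective; here is a sketch (cv_objective is a hypothetical name, and 5 folds is an arbitrary choice):

from sklearn.model_selection import cross_val_score

def cv_objective(config):
    model = XGBClassifier(
        n_estimators=int(config["n_estimators"]),
        max_depth=int(config["max_depth"]),
        learning_rate=config["learning_rate"],
        subsample=config["subsample"],
        colsample_bytree=config["colsample_bytree"],
        random_state=42,
    )
    # Mean 5-fold cross-validation accuracy on the training set
    scores = cross_val_score(model, X_train, y_train, cv=5, scoring="accuracy")
    return {"accuracy": scores.mean()}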

Next, we created a Ray Tune experiment with the BayesOptSearch algorithm, specifying the metric to optimize (accuracy) and the optimization mode (maximization). We ran the experiment with 50 trials and 2 CPUs per trial.
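If you already have a configuration that works reasonably well, you can seed the search with it so the surrogate model starts from a known-good region instead of purely random points. BayesOptSearch accepts points_to_evaluate for this; the values below are illustrative defaults, not recommendations:

bayesopt = BayesOptSearch(
    metric="accuracy",
    mode="max",
    # Evaluate this configuration first, before the Bayesian search takes over
    points_to_evaluate=[{
        "n_estimators": 100.0,
        "max_depth": 6.0,
        "learning_rate": 0.1,
        "subsample": 1.0,
        "colsample_bytree": 1.0,
    }],
)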

Finally, we retrieved the best hyperparameters and the corresponding accuracy from the analysis object.
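The tuned values are typically used to fit a final model. Here is a short follow-on sketch that retrains on the training set with the best configuration and evaluates it on the held-out test set (note the casts back to int, since the search space is continuous):

# Retrain a final model with the best hyperparameters found
final_model = XGBClassifier(
    n_estimators=int(best_config["n_estimators"]),
    max_depth=int(best_config["max_depth"]),
    learning_rate=best_config["learning_rate"],
    subsample=best_config["subsample"],
    colsample_bytree=best_config["colsample_bytree"],
    random_state=42,
)
final_model.fit(X_train, y_train)
print(f"Final test accuracy: {accuracy_score(y_test, final_model.predict(X_test)):.4f}")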

By using Ray Tune with Bayesian Optimization, you can efficiently find high-performing hyperparameters for your XGBoost models, potentially saving significant computational resources compared to manual tuning or exhaustive search methods. Ray Tune’s integration with popular machine learning frameworks like XGBoost, PyTorch, and TensorFlow makes it a versatile tool for hyperparameter optimization across various domains and tasks.


