Ax is a Python library for adaptive experimentation, developed at Meta (formerly Facebook). It uses Bayesian optimization to tune hyperparameters efficiently, making it a valuable tool for optimizing complex machine learning models such as XGBoost.
Bayesian optimization intelligently selects the next set of hyperparameters to evaluate based on the results of previous evaluations, enabling it to find better hyperparameters in fewer iterations compared to traditional methods like grid search. Ax’s implementation of Bayesian optimization can handle complex search spaces and integrates seamlessly with PyTorch.
Here’s an example of how to use Ax to optimize XGBoost hyperparameters for a binary classification problem:
First, install Ax (along with XGBoost and scikit-learn if they are not already installed):
pip install ax-platform xgboost scikit-learn
Then, use Ax to define the search space and optimize the hyperparameters:
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from ax.service.managed_loop import optimize
# Generate synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_classes=2, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Define the objective function to maximize
def xgb_eval(parameterization):
    params = {
        'max_depth': parameterization.get('max_depth'),
        'learning_rate': parameterization.get('learning_rate'),
        'subsample': parameterization.get('subsample'),
        'colsample_bytree': parameterization.get('colsample_bytree'),
        'n_estimators': 100,
        'objective': 'binary:logistic',
        'random_state': 42,
    }
    model = XGBClassifier(**params)
    model.fit(X_train, y_train)
    # score() returns accuracy, so compute AUC explicitly from predicted probabilities
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    return auc
# Define the search space
parameters = [
    {"name": "max_depth", "type": "range", "bounds": [3, 10], "value_type": "int"},
    {"name": "learning_rate", "type": "range", "bounds": [0.01, 0.3]},
    {"name": "subsample", "type": "range", "bounds": [0.5, 1.0]},
    {"name": "colsample_bytree", "type": "range", "bounds": [0.5, 1.0]},
]
# Optimize the hyperparameters
best_parameters, best_values, experiment, model = optimize(
    parameters=parameters,
    evaluation_function=xgb_eval,
    objective_name='auc',
    total_trials=30,
    random_seed=42,
)
# Print the best hyperparameters and the estimated best AUC
# (best_values is a tuple of metric means and covariances)
print(f"Best hyperparameters: {best_parameters}")
print(f"Best AUC: {best_values[0]}")
In this example:
- We generate a synthetic binary classification dataset using scikit-learn’s make_classification function and split it into train and test sets.
- We define an objective function, xgb_eval, that takes hyperparameters as input, creates an XGBoost classifier with those hyperparameters, fits it on the training data, and returns the model’s AUC on the test data (a cross-validated variant is sketched after this list).
- We define the search space in Ax’s format, specifying each hyperparameter and its range.
- We call Ax’s optimize function to find the best hyperparameters, passing the search space, the evaluation function, the metric to optimize (AUC), and the total number of trials.
- After optimization, we print the best hyperparameters and the corresponding best AUC.
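A single train/test split makes the objective noisy, and Ax can take that noise into account: the evaluation function may return a (mean, SEM) pair for the metric instead of a bare float. A cross-validated variant of the objective, sketched here under the assumption that the dataset and imports above are in scope, could look like this:

from sklearn.model_selection import cross_val_score

def xgb_eval_cv(parameterization):
    # Same fixed settings as before, with the tuned hyperparameters unpacked from Ax
    model = XGBClassifier(
        n_estimators=100,
        objective='binary:logistic',
        random_state=42,
        **parameterization,
    )
    # 5-fold cross-validated AUC on the training data
    scores = cross_val_score(model, X_train, y_train, cv=5, scoring='roc_auc')
    # Returning (mean, SEM) lets Ax model the observation noise explicitly
    return {'auc': (scores.mean(), scores.std(ddof=1) / np.sqrt(len(scores)))}

Passing xgb_eval_cv as the evaluation_function, with the same parameters and objective_name, is otherwise a drop-in replacement for xgb_eval.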
By leveraging Bayesian optimization with Ax, we can efficiently find high-performing hyperparameters for XGBoost, potentially saving significant computational resources compared to exhaustive search methods. Ax’s ability to handle complex search spaces and its integration with PyTorch make it a powerful tool for optimizing machine learning models.