
XGBoost Train Model With Custom Objective Function

XGBoost allows users to define custom objective functions, enabling the optimization of models for specific problems or metrics.

This example demonstrates how to train an XGBoost model with a custom objective function that implements the mean squared error (MSE) loss for a regression task.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
import xgboost as xgb

# Generate a synthetic regression dataset
X, y = make_regression(n_samples=1000, n_features=10, noise=0.1, random_state=42)

# Split data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Define a custom objective function
def custom_mse(y_true, y_pred):
    # Gradient of the squared-error loss with respect to the predictions
    gradient = y_pred - y_true
    # The second derivative is constant, so the hessian is an array of ones
    hessian = np.ones_like(gradient)
    return gradient, hessian

# Initialize an XGBRegressor with the custom objective
model = xgb.XGBRegressor(objective=custom_mse, n_estimators=100, learning_rate=0.1, random_state=42)

# Train the model
model.fit(X_train, y_train)

# Make predictions on the test set
y_pred = model.predict(X_test)

# Calculate MSE on the test set
mse = mean_squared_error(y_test, y_pred)
print(f"Test MSE: {mse:.4f}")

Here’s a step-by-step breakdown:

  1. Import the necessary libraries: NumPy for numerical operations, scikit-learn for generating a synthetic dataset and evaluating the model, and XGBoost for training the model.

  2. Generate a synthetic regression dataset using make_regression from scikit-learn. This function creates a dataset with a specified number of samples, features, and noise level.

  3. Split the generated data into training and test sets using train_test_split from scikit-learn.

  4. Define a custom objective function named custom_mse that takes the true labels (y_true) and the current predictions (y_pred) as input and returns the gradient and hessian of the loss with respect to the predictions. Note that the scikit-learn wrapper passes the true labels first and the predictions second.

  5. Initialize an XGBRegressor with the custom objective function, along with other hyperparameters such as the number of trees (n_estimators), learning rate, and random state for reproducibility.

  6. Train the XGBoost model on the training data using the fit method.

  7. Make predictions on the test set using the trained model’s predict method.

  8. Calculate the mean squared error on the test set predictions using scikit-learn’s mean_squared_error function.

  9. Print the test MSE to evaluate the model trained with the custom objective (a quick check that the estimator actually stored the callable follows this list).
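
As a small sanity check (a sketch reusing the fitted model from the example above), you can confirm that the estimator is holding the custom callable rather than a built-in objective string:

# The scikit-learn wrapper keeps the constructor arguments, including the callable
print(model.get_params()["objective"])            # e.g. <function custom_mse at 0x...>
print(callable(model.get_params()["objective"]))  # True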

Understanding The Custom Objective Function in XGBoost

XGBoost allows the use of custom objective functions, which need to return the gradient and hessian of the loss function.

These are used in the gradient boosting process to update the model.

  1. Gradient: The gradient of the loss function with respect to the predictions. It represents the direction and rate of change of the loss function.
  2. Hessian: The second derivative of the loss function with respect to the predictions. It measures the curvature (or rate of change) of the gradient.
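
The example above uses the scikit-learn wrapper, where the objective is called as objective(y_true, y_pred). If you train with the native API instead, the custom objective is passed via the obj argument of xgb.train and receives the raw predictions together with the training DMatrix. The sketch below (reusing X_train and y_train from the example, with hyperparameters chosen only for illustration) shows the equivalent setup:

def custom_mse_native(preds, dtrain):
    # Native API: predictions come first, labels are read from the DMatrix
    labels = dtrain.get_label()
    gradient = preds - labels
    hessian = np.ones_like(gradient)
    return gradient, hessian

dtrain = xgb.DMatrix(X_train, label=y_train)
booster = xgb.train({"learning_rate": 0.1}, dtrain, num_boost_round=100, obj=custom_mse_native)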

Mean Squared Error (MSE)

The Mean Squared Error (MSE) is a common loss function for regression tasks. It measures the average squared difference between the true values (y_true) and the predicted values (y_pred). Mathematically, it is defined as:

MSE = (1/n) * sum((y_true - y_pred)^2)

Where n is the number of samples.
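
As a quick worked example (with made-up values chosen only for illustration), the formula can be checked directly against scikit-learn's implementation:

import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# (1/n) * sum((y_true - y_pred)^2) = (0.25 + 0.25 + 0.0 + 1.0) / 4 = 0.375
print(np.mean((y_true - y_pred) ** 2))       # 0.375
print(mean_squared_error(y_true, y_pred))    # 0.375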

Gradient and Hessian for MSE

Given the loss function MSE, the gradient and hessian can be derived as follows:

  1. Gradient: The gradient of MSE with respect to the predictions (y_pred) is:
d(MSE)/d(y_pred) = 2 * (y_pred - y_true)

XGBoost's built-in squared-error objective follows the convention of using half the squared error per sample, i.e. (1/2) * (y_pred - y_true)^2, under which the constant factor of 2 drops out, so we can use:

Gradient = y_pred - y_true

  2. Hessian: The hessian (second derivative) of MSE with respect to the predictions is:

d^2(MSE)/d(y_pred)^2 = 2

With the same convention of halving the loss, the second derivative is constant and equal to 1 for every sample, so we use:

Hessian = 1
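
A quick numerical sanity check (a sketch using the half-squared-error convention described above, evaluated at an arbitrary example point) confirms these formulas:

import numpy as np

def half_squared_error(y_true, y_pred):
    return 0.5 * (y_pred - y_true) ** 2

y_true, y_pred, eps = 3.0, 2.2, 1e-4

# Central-difference estimate of the first derivative: should match y_pred - y_true = -0.8
grad_num = (half_squared_error(y_true, y_pred + eps)
            - half_squared_error(y_true, y_pred - eps)) / (2 * eps)

# Central-difference estimate of the second derivative: should be approximately 1
hess_num = (half_squared_error(y_true, y_pred + eps)
            - 2 * half_squared_error(y_true, y_pred)
            + half_squared_error(y_true, y_pred - eps)) / eps ** 2

print(grad_num)  # approximately -0.8
print(hess_num)  # approximately 1.0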

Custom Objective Function Implementation

Here’s how the custom_mse function incorporates these concepts:

def custom_mse(y_true, y_pred):
    gradient = y_pred - y_true  # Difference between predicted and true values
    hessian = np.ones_like(gradient)  # Array of ones, same shape as gradient
    return gradient, hessian

By returning the gradient and hessian, XGBoost can use these values to compute the split gains and leaf weights of each new tree in every boosting iteration, thereby minimizing the loss function (MSE in this case).
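
To convince yourself that the custom objective behaves like the built-in one, you can train a reference model with objective="reg:squarederror" on the same data and compare test scores (a sketch reusing the variables from the example above; the two scores should be very close, though not necessarily identical, because the wrapper may initialize the model slightly differently when a callable objective is supplied):

# Reference model trained with XGBoost's built-in squared-error objective
baseline = xgb.XGBRegressor(objective="reg:squarederror", n_estimators=100,
                            learning_rate=0.1, random_state=42)
baseline.fit(X_train, y_train)

baseline_mse = mean_squared_error(y_test, baseline.predict(X_test))
custom_obj_mse = mean_squared_error(y_test, y_pred)
print(f"Built-in objective MSE: {baseline_mse:.4f}")
print(f"Custom objective MSE:   {custom_obj_mse:.4f}")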


