
XGBoost get_num_boosting_rounds() Method

The get_num_boosting_rounds() method in XGBoost allows you to retrieve the number of boosting rounds configured for model training.

Accessing this information can be useful for understanding the model’s training process and making informed decisions about model configuration, especially if the number of boosting rounds (e.g. n_estimators) was not set explicitly.

This example demonstrates how to use get_num_boosting_rounds() and interpret the returned value.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Load the Breast Cancer dataset
data = load_breast_cancer()
X, y = data.data, data.target

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train an XGBoost model without specifying n_estimators,
# so the default number of boosting rounds is used
model = XGBClassifier(learning_rate=0.1, max_depth=3, random_state=42)
model.fit(X_train, y_train)

# Retrieve the number of boosting rounds performed
num_rounds = model.get_num_boosting_rounds()

# Print the result
print(f"Number of boosting rounds performed: {num_rounds}")

The get_num_boosting_rounds() method returns the number of boosting rounds configured for the model.

This is either the explicit value of n_estimators or, if the parameter was left unset (as in the example above), the library default, which is 100 in recent versions of XGBoost.

Note that if early stopping is used, get_num_boosting_rounds() still returns the configured total; the best-performing round is retrieved via the best_iteration property instead.

Use the number of rounds as an indicator of the model’s complexity. A higher number of rounds suggests a more complex model, which may be more prone to overfitting. Conversely, a lower number of rounds may indicate a simpler model that could potentially underfit the data.

By examining the number of boosting rounds performed, you can gain insights into the model’s training dynamics and make informed decisions about adjusting the n_estimators parameter or other related hyperparameters to optimize model performance.

Keep in mind that the interpretation of the number of boosting rounds should be considered alongside other factors, such as the model’s performance metrics and the specific characteristics of your dataset.
