
XGBoost Compare "iteration_range" vs "ntree_limit" Parameters

The iteration_range and ntree_limit parameters in XGBoost both control the number of boosting rounds used during prediction.

However, ntree_limit has been deprecated and replaced by iteration_range in newer versions of the library.

In recent versions of the library, passing the ntree_limit parameter raises an error:

TypeError: XGBClassifier.predict() got an unexpected keyword argument 'ntree_limit'

This example demonstrates the equivalent functionality of these parameters when making predictions with an XGBoost model.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
import xgboost as xgb

# Generate a synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_features=10, n_classes=2, random_state=42)

# Split the data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize an XGBoost model and train it on the data
model = xgb.XGBClassifier(n_estimators=100, objective='binary:logistic', eval_metric='logloss')
model.fit(X_train, y_train)

# Make predictions on the test set using "ntree_limit"
# (raises a TypeError in recent XGBoost versions)
# predictions = model.predict(X_test, ntree_limit=50)  # TypeError

# Make predictions on the test set using "iteration_range"
predictions = model.predict(X_test, iteration_range=(0, 50))

In this example, we first generate a synthetic binary classification dataset using sklearn.datasets.make_classification. We then split the data into train and test sets.

Next, we initialize an XGBoost classifier with 100 boosting rounds and train it on the data. To compare the ntree_limit and iteration_range parameters, we make predictions on the test set:

Using iteration_range=(0, 50), which specifies a half-open range of boosting rounds to use for prediction: rounds 0 through 49, i.e., the first 50 rounds (the upper bound is exclusive).
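
To see what this truncation does in practice, we can compare the 50-round predictions against predictions from the full 100-round model. This is a minimal sketch that reuses model, X_test, and predictions from the example above:

import numpy as np

# Predictions from the full model (all 100 boosting rounds)
full_predictions = model.predict(X_test)

# Fraction of test samples where the truncated model disagrees with the full model
disagreement = np.mean(predictions != full_predictions)
print(f"50-round vs 100-round disagreement: {disagreement:.2%}")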

When working with newer versions of XGBoost, we must use iteration_range instead of ntree_limit, as the latter has been deprecated and removed.
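
If your code needs to run on both older and newer XGBoost releases, one option is to try iteration_range first and fall back to ntree_limit when it is not supported. The helper below is a hypothetical sketch, not part of the XGBoost API:

def predict_first_n_rounds(model, X, n):
    """Predict with the first n boosting rounds across XGBoost versions (hypothetical helper)."""
    try:
        # Newer XGBoost: iteration_range is a half-open interval of rounds
        return model.predict(X, iteration_range=(0, n))
    except TypeError:
        # Older XGBoost: ntree_limit=n uses the first n trees
        return model.predict(X, ntree_limit=n)

predictions_50 = predict_first_n_rounds(model, X_test, 50)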

By specifying the range of boosting rounds to use during prediction, you can trade off model complexity against predictive performance without retraining.
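
For example, sweeping the upper bound of iteration_range lets you measure how test accuracy changes as more boosting rounds are used. This sketch reuses the model and test split from the example above:

from sklearn.metrics import accuracy_score

# Evaluate test accuracy with progressively more boosting rounds
for n_rounds in (10, 25, 50, 100):
    preds = model.predict(X_test, iteration_range=(0, n_rounds))
    print(f"First {n_rounds:3d} rounds: accuracy = {accuracy_score(y_test, preds):.3f}")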


