
Configure XGBoost "booster" Parameter

The booster parameter in XGBoost determines the type of base learner used at each boosting round, and choosing it well can substantially affect your model's performance.

This tip discusses the three available options (gbtree, gblinear, and dart) and provides guidance on choosing the right booster type for different machine learning scenarios.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier with the 'gbtree' booster
model = XGBClassifier(booster='gbtree', eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

Understanding the “booster” Parameter

The booster parameter in XGBoost defines the type of model you train. It has three settings:

- gbtree (the default): gradient-boosted decision trees.
- gblinear: boosted linear models; each round fits a regularized linear function, so the final model is itself linear.
- dart: gradient-boosted decision trees with dropout (DART), which randomly drops existing trees during training to reduce overfitting.
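All three are selected the same way, through the booster argument. A minimal sketch (the variable names are illustrative):

from xgboost import XGBClassifier

# Tree-based boosting (the default)
tree_model = XGBClassifier(booster='gbtree', eval_metric='logloss')

# Linear boosting; tree-specific parameters such as max_depth do not apply
linear_model = XGBClassifier(booster='gblinear', eval_metric='logloss')

# DART boosting; accepts extra dropout parameters such as rate_drop
dart_model = XGBClassifier(booster='dart', rate_drop=0.1, eval_metric='logloss')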

Choosing the Right Booster

- gbtree is the strongest default for most tabular problems, where nonlinear relationships and feature interactions matter.
- gblinear is worth trying when the relationship between features and target is largely linear, or on very high-dimensional sparse data, where it behaves like a regularized linear model.
- dart can help when gbtree overfits: dropping trees during training acts as extra regularization, at the cost of slower training and additional parameters to tune.

In practice, the most reliable way to decide is to compare the options empirically, as in the sketch below.
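A minimal sketch of such a comparison, reusing the synthetic dataset from the example above (the 5-fold cross-validation setup is an assumption for illustration):

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Same synthetic dataset as the example above
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2,
                           n_redundant=10, random_state=42)

# Report mean cross-validated accuracy for each booster type
for booster in ['gbtree', 'gblinear', 'dart']:
    model = XGBClassifier(booster=booster, eval_metric='logloss')
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{booster}: {scores.mean():.4f}")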

Practical Tips

- Start with the default gbtree; it performs well on most problems.
- Compare booster types with cross-validation rather than a single train/test split, since the differences can be small.
- If you use dart, tune its dropout parameters (rate_drop and skip_drop) alongside the usual tree parameters; see the sketch below.
- Remember that gblinear ignores tree-specific parameters such as max_depth and min_child_weight, so retune when you switch boosters.
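An illustrative sketch of the dart-specific parameters in use (the dropout values below are assumptions for demonstration, not tuned recommendations):

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Same synthetic dataset and split as the example above
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2,
                           n_redundant=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# DART booster with dropout: rate_drop is the fraction of trees dropped per
# boosting round, skip_drop is the probability of skipping dropout entirely
model = XGBClassifier(booster='dart', rate_drop=0.1, skip_drop=0.5,
                      eval_metric='logloss')
model.fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.4f}")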


