
Configure XGBoost "max_depth" Parameter

The max_depth parameter in XGBoost controls the maximum depth of a tree in the model. By adjusting max_depth, you can influence the model’s complexity and its ability to generalize.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier with a lower max_depth value
model = XGBClassifier(max_depth=3, eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions and evaluate accuracy on the test set
predictions = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, predictions):.3f}")

Understanding the “max_depth” Parameter

The max_depth parameter determines the maximum depth of each tree in the XGBoost model. It is a regularization parameter that can help control overfitting by limiting the model’s complexity: a tree of depth d can have at most 2^d leaves, so capping the depth directly caps how finely each tree can partition the data. max_depth accepts non-negative integers (0 means no depth limit under some tree-growing policies), and the default value in XGBoost is 6.

Choosing the Right “max_depth” Value

The value of max_depth affects the model’s complexity and its propensity to overfit:

- Smaller values (e.g., 1–3) produce shallow trees that capture only simple feature interactions. They act as strong regularizers, reducing overfitting but risking underfitting.
- Larger values (e.g., 8 and above) produce deep trees that can model complex interactions, but they are more likely to memorize the training data and are slower and more memory-intensive to train.

When setting max_depth, consider the trade-off between model complexity and performance: deeper trees typically improve training accuracy but can generalize worse, so compare candidate values on held-out data rather than on the training set.

Practical Tips

- Start with the default value of 6 and adjust up or down based on validation performance.
- Use cross-validation to compare candidate max_depth values rather than relying on a single train/test split.
- Remember that deeper trees increase training time and memory usage, not just overfitting risk.
- Tune max_depth together with other regularization parameters such as min_child_weight, gamma, and subsample, since they interact.
