
Configure XGBoost "gamma" Parameter

The gamma parameter in XGBoost controls the minimum loss reduction required to make a split on a leaf node of the tree.

By adjusting gamma, you can influence the model’s complexity and its ability to generalize.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier with a higher gamma value
model = XGBClassifier(gamma=0.5, eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

An alias for the gamma parameter is min_split_loss.

For example:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier using the min_split_loss alias for gamma
model = XGBClassifier(min_split_loss=0.5, eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

Understanding the “gamma” Parameter

The gamma parameter is a regularization term that governs the minimum loss reduction needed for a split to occur.

In other words, it specifies the minimum improvement in the model’s objective function that a new partition must bring to justify its creation. gamma is a non-negative value, and higher values make the model more conservative.

The default value of gamma in XGBoost is 0.
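Because gamma acts as a minimum-gain threshold, raising it prunes away candidate splits whose loss reduction falls below that threshold, which is visible in the size of the resulting trees. The following sketch reuses the synthetic dataset from the examples above and counts the leaves produced at a few gamma values; the exact counts depend on the data and the other hyperparameters, and trees_to_dataframe() requires pandas to be installed.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit a model for each gamma value and count the leaves across all trees
for gamma in [0, 1, 10]:
    model = XGBClassifier(gamma=gamma, eval_metric='logloss')
    model.fit(X_train, y_train)
    tree_df = model.get_booster().trees_to_dataframe()
    n_leaves = (tree_df['Feature'] == 'Leaf').sum()
    print(f"gamma={gamma}: {n_leaves} leaves in total")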

Choosing the Right “gamma” Value

The value of gamma affects the model’s complexity and its propensity to overfit:

- A lower gamma (the default is 0) allows splits that yield even a small loss reduction, producing deeper, more complex trees that may overfit the training data.
- A higher gamma only permits splits that achieve a substantial loss reduction, producing simpler, more conservative trees that tend to generalize better but may underfit if the value is set too high.

When setting gamma, consider the trade-off between model complexity and performance: a value that is too low leaves overfitting unchecked, while a value that is too high can prune away genuinely useful splits, as the sketch below illustrates.
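One way to see this trade-off in practice is to train a model for each value in a small range of gamma values and compare accuracy on a held-out test set. The sketch below assumes the same synthetic dataset as the earlier examples; the candidate values and whichever one performs best here are illustrative, not recommendations.

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Evaluate a small range of gamma values on the held-out test set
for gamma in [0, 0.1, 0.5, 1, 5]:
    model = XGBClassifier(gamma=gamma, eval_metric='logloss')
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"gamma={gamma}: test accuracy = {accuracy:.3f}")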

Practical Tips

- Start with the default value of gamma (0) and increase it only if the model shows signs of overfitting.
- Tune gamma together with other complexity controls such as max_depth and min_child_weight, since their effects interact.
- Prefer cross-validation over a single train/test split when comparing gamma values (see the sketch below).

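To put these tips into practice, gamma is usually cross-validated together with a related complexity control such as max_depth. The following is a minimal sketch using scikit-learn's GridSearchCV on the same synthetic dataset as above; the grid values are illustrative assumptions, not tuned recommendations.

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Candidate values for gamma and a related complexity parameter
param_grid = {
    'gamma': [0, 0.1, 0.5, 1, 5],
    'max_depth': [3, 6],
}

# 5-fold cross-validated grid search over the candidate values
grid = GridSearchCV(
    estimator=XGBClassifier(eval_metric='logloss'),
    param_grid=param_grid,
    scoring='accuracy',
    cv=5,
)
grid.fit(X, y)

print(f"Best parameters: {grid.best_params_}")
print(f"Best cross-validated accuracy: {grid.best_score_:.3f}")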

See Also