
Configure XGBoost "min_child_weight" Parameter

The min_child_weight parameter in XGBoost controls the minimum sum of instance weight needed in a child node.

By adjusting min_child_weight, you can influence the model’s complexity and its ability to generalize.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier with a higher min_child_weight value
model = XGBClassifier(min_child_weight=5, eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

Understanding the “min_child_weight” Parameter

The min_child_weight parameter determines the minimum sum of instance weight (hessian) needed in a child node for a split to be made.

It is a regularization parameter that can help control overfitting by preventing the creation of overly complex trees. min_child_weight accepts non-negative values, and the default value in XGBoost is 1.

Choosing the Right “min_child_weight” Value

The value of min_child_weight affects the model’s complexity and its propensity to overfit:

- Lower values (such as the default of 1) permit splits on small groups of instances, producing deeper, more complex trees that can capture fine-grained patterns but may overfit.
- Higher values require more instance weight in each child before a split is made, producing shallower, more conservative trees that generalize better but may underfit.

When setting min_child_weight, consider the trade-off between model complexity and performance:

- If the model overfits (training performance far exceeds validation performance), try increasing min_child_weight.
- If the model underfits, lower the value back toward the default.
- Tune it alongside related parameters such as max_depth, ideally with cross-validation rather than a single train/test split.

Practical Tips

- Start with the default value of 1 and adjust only if you see signs of over- or underfitting.
- For noisy data, larger values can stabilize the trees by preventing splits on a handful of instances.
- Remember the threshold applies to the sum of hessians, not the raw instance count: for squared-error regression the hessian is 1 per instance so the two coincide, but for classification the hessian per instance is at most 0.25, so a given value is effectively stricter.
- Validate each candidate value with cross-validation before settling on it.
