
Configure XGBoost "verbosity" Parameter

The verbosity parameter in XGBoost controls the level of messages printed during training, which can be invaluable for monitoring progress and debugging issues.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic data
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the XGBoost classifier with verbosity set to 3
model = XGBClassifier(verbosity=3, eval_metric='logloss')

# Fit the model
model.fit(X_train, y_train)

# Make predictions
predictions = model.predict(X_test)

Example output (your output may vary):

[09:25:56] ======== Monitor (0): HostSketchContainer ========
[09:25:56] AllReduce: 0.000138s, 1 calls @ 138us
[09:25:56] MakeCuts: 0.000212s, 1 calls @ 212us
[09:25:56] DEBUG: /Users/runner/work/xgboost/xgboost/src/gbm/gbtree.cc:130: Using tree method: 0
[09:25:56] ======== Monitor (0): Learner ========
[09:25:56] Configure: 0.000211s, 1 calls @ 211us
[09:25:56] EvalOneIter: 0.0002s, 100 calls @ 200us
[09:25:56] GetGradient: 0.000734s, 100 calls @ 734us
[09:25:56] PredictRaw: 5e-05s, 100 calls @ 50us
[09:25:56] UpdateOneIter: 0.048202s, 100 calls @ 48202us
[09:25:56] ======== Monitor (0): GBTree ========
[09:25:56] BoostNewTrees: 0.046797s, 100 calls @ 46797us
[09:25:56] CommitModel: 3.3e-05s, 100 calls @ 33us
[09:25:56] ======== Monitor (0): HistUpdater ========
[09:25:56] BuildHistogram: 0.01288s, 441 calls @ 12880us
[09:25:56] EvaluateSplits: 0.021687s, 541 calls @ 21687us
[09:25:56] InitData: 0.00103s, 100 calls @ 1030us
[09:25:56] InitRoot: 0.006048s, 100 calls @ 6048us
[09:25:56] LeafPartition: 1.2e-05s, 100 calls @ 12us
[09:25:56] UpdatePosition: 0.00524s, 483 calls @ 5240us
[09:25:56] UpdatePredictionCache: 0.000849s, 100 calls @ 849us
[09:25:56] UpdateTree: 0.045625s, 100 calls @ 45625us
[09:25:56] DEBUG: /Users/runner/work/xgboost/xgboost/src/gbm/gbtree.cc:130: Using tree method: 0

Understanding the “verbosity” Parameter

The verbosity parameter determines how much information XGBoost prints during training. It accepts integer values from 0 to 3, each corresponding to a different level of detail:

- 0 (silent): print nothing
- 1 (warning): print warnings only; this is the default
- 2 (info): print additional informational messages
- 3 (debug): print detailed debug output, including the internal timing monitors shown in the example above
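As a quick illustration, here is a minimal sketch that contrasts a fully silent run with a debug-level run on a small synthetic dataset; the model names are illustrative, and the volume of debug output you see will depend on your XGBoost version:

from sklearn.datasets import make_classification
from xgboost import XGBClassifier

# Small synthetic dataset for demonstration
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# verbosity=0: completely silent, no warnings or progress messages
silent_model = XGBClassifier(verbosity=0, eval_metric='logloss')
silent_model.fit(X, y)

# verbosity=3: debug output, including internal timing monitors
debug_model = XGBClassifier(verbosity=3, eval_metric='logloss')
debug_model.fit(X, y)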

Choosing the Right Verbosity Level

The appropriate verbosity level depends on your specific use case and the stage of your model development process:

- 0 (silent) suits production pipelines and automated jobs where no console output is wanted.
- 1 (warning), the default, is a sensible choice for everyday experimentation.
- 2 (info) helps when you want to follow training progress more closely.
- 3 (debug) is best reserved for troubleshooting, since it produces the most detailed (and most voluminous) output.

Keep in mind that higher verbosity levels may slightly slow down the training process due to the additional information being printed.
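If you would rather not set verbosity on every estimator, recent XGBoost releases (1.2 and later) also expose a global configuration. The sketch below assumes xgboost.set_config and xgboost.config_context are available in your installed version:

import xgboost as xgb
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# Set the global verbosity for the whole process (1 = warnings only)
xgb.set_config(verbosity=1)

# Temporarily override the global setting inside this block only
with xgb.config_context(verbosity=0):
    model = XGBClassifier(eval_metric='logloss')
    model.fit(X, y)

The context manager restores the previous global verbosity when the block exits, which is convenient when a single noisy step sits inside an otherwise quiet pipeline.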

Practical Tips



See Also