
Get All XGBoost Model Parameters

Retrieving model parameters is essential for understanding, reproducing, and sharing trained models.

XGBoost provides two methods to get model parameters: get_params() and get_xgb_params().

This example demonstrates how to use these methods to access model parameters.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Generate synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_classes=2, random_state=42)

# Split data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Set model parameters
params = {'objective': 'binary:logistic', 'max_depth': 3, 'eta': 0.1, 'seed': 42}

# Create an instance of the XGBClassifier with the specified parameters
model = XGBClassifier(**params)

# Train the XGBoost model on the full training set
model.fit(X_train, y_train)

# Get model parameters using get_params()
params_sklearn = model.get_params()
print("Model parameters (get_params()):")
print(params_sklearn)

# Get model parameters using get_xgb_params()
params_xgb = model.get_xgb_params()
print("\nModel parameters (get_xgb_params()):")
print(params_xgb)

The get_params() method is part of the scikit-learn API and returns a dictionary of the model’s parameters. This method is useful when using XGBoost with scikit-learn’s estimator interface, such as in grid search or pipeline operations.

On the other hand, the get_xgb_params() method is specific to XGBoost and returns a dictionary of the model’s XGBoost-specific parameters. This method is handy when you need to access or modify the underlying XGBoost parameters directly.

The output of the two methods differs because each API exposes a slightly different set of parameters.

The get_params() output includes parameters like objective, max_depth, and eta, which were set explicitly in the params dictionary, along with the default values of every other argument of the scikit-learn estimator.

The get_xgb_params() output is similar but omits scikit-learn-only arguments such as n_estimators, keeping only the parameters the native XGBoost trainer understands. Note that native aliases you pass explicitly (such as eta, the native name for learning_rate) appear in both dictionaries under the name you used.

Together, these methods let you inspect the configuration of a trained XGBoost model from either the scikit-learn or the native perspective, which is key for model interpretation, reproducibility, and sharing.

See Also