
Parameters

Helpful examples for configuring XGBoost model parameters (hyperparameters).

They are parameters in the programming sense (arguments passed to functions and constructors) but hyperparameters in the machine learning sense (settings that influence how the model is trained and behaves). Strictly speaking, the model parameters are the trees and leaf weights found by the learning algorithm during training.

Examples
Configure XGBoost "alpha" Parameter
Configure XGBoost "binary:hinge" Objective
Configure XGBoost "binary:logistic" Objective
Configure XGBoost "binary:logitraw" Objective
Configure XGBoost "booster" Parameter
Configure XGBoost "colsample_bylevel" Parameter
Configure XGBoost "colsample_bynode" Parameter
Configure XGBoost "colsample_bytree" Parameter
Configure XGBoost "count:poisson" Objective
Configure XGBoost "device" Parameter
Configure XGBoost "early_stopping_rounds" Parameter
Configure XGBoost "enable_categorical" Parameter
Configure XGBoost "eta" Parameter
Configure XGBoost "eval_metric" Parameter
Configure XGBoost "eval_set" Parameter
Configure XGBoost "gamma" Parameter
Configure XGBoost "grow_policy" Parameter
Configure XGBoost "importance_type" Parameter
Configure XGBoost "interaction_constraints" Parameter
Configure XGBoost "iteration_range" Parameter for predict()
Configure XGBoost "lambda" Parameter
Configure XGBoost "learning_rate" Parameter
Configure XGBoost "max_bin" Parameter
Configure XGBoost "max_cat_threshold" Parameter
Configure XGBoost "max_cat_to_onehot" Parameter
Configure XGBoost "max_delta_step" Parameter
Configure XGBoost "max_depth" Parameter
Configure XGBoost "max_leaves" Parameter
Configure XGBoost "min_child_weight" Parameter
Configure XGBoost "min_split_loss" Parameter
Configure XGBoost "missing" Parameter
Configure XGBoost "monotone_constraints" Parameter
Configure XGBoost "multi_strategy" Parameter
Configure XGBoost "multi:softmax" Objective
Configure XGBoost "multi:softprob" Objective
Configure XGBoost "n_estimators" Parameter
Configure XGBoost "n_jobs" Parameter
Configure XGBoost "nthread" Parameter
Configure XGBoost "num_boost_round" Parameter
Configure XGBoost "num_class" Parameter
Configure XGBoost "num_parallel_tree" Parameter
Configure XGBoost "objective" Parameter
Configure XGBoost "random_state" Parameter
Configure XGBoost "rank:map" Objective
Configure XGBoost "rank:ndcg" Objective
Configure XGBoost "rank:pairwise" Objective
Configure XGBoost "reg_alpha" Parameter
Configure XGBoost "reg_lambda" Parameter
Configure XGBoost "reg:absoluteerror" Objective (mean absolute error)
Configure XGBoost "reg:gamma" Objective
Configure XGBoost "reg:linear" Objective
Configure XGBoost "reg:logistic" Objective
Configure XGBoost "reg:pseudohubererror" Objective
Configure XGBoost "reg:quantileerror" Objective
Configure XGBoost "reg:squarederror" Objective
Configure XGBoost "reg:squaredlogerror" Objective
Configure XGBoost "reg:tweedie" Objective
Configure XGBoost "sampling_method" Parameter
Configure XGBoost "seed" Parameter
Configure XGBoost "subsample" Parameter
Configure XGBoost "survival:aft" Objective
Configure XGBoost "survival:cox" Objective
Configure XGBoost "tree_method" Parameter
Configure XGBoost "use_label_encoder" Parameter
Configure XGBoost "validate_parameters" Parameter
Configure XGBoost "verbosity" Parameter
Configure XGBoost "xgb_model" Parameter
Configure XGBoost Approximate Tree Method (tree_method=approx)
Configure XGBoost Automatic Tree Method (tree_method=auto)
Configure XGBoost Dart Booster
Configure XGBoost Early Stopping Regularization
Configure XGBoost Early Stopping Tolerance
Configure XGBoost Early Stopping Via Callback
Configure XGBoost Exact Tree Method (tree_method=exact)
Configure XGBoost Histogram Tree Method (tree_method=hist)
Configure XGBoost L1 Regularization
Configure XGBoost L2 Regularization
Configure XGBoost Linear Booster (gblinear)
Configure XGBoost Model with Parameters Defined in a dict
Configure XGBoost Objective "binary:logistic" vs "binary:logitraw"
Configure XGBoost Objective "multi:softmax" vs "multi:softprob"
Configure XGBoost Objective "reg:logistic" vs "binary:logistic"
Configure XGBoost Objective "survival:cox" vs "survival:aft"
Configure XGBoost Tree Booster (gbtree)
Get All XGBoost Model Parameters
Tune "num_boost_round" Parameter to xgboost.train()
Tune XGBoost "alpha" Parameter
Tune XGBoost "booster" Parameter
Tune XGBoost "colsample_bylevel" Parameter
Tune XGBoost "colsample_bynode" Parameter
Tune XGBoost "colsample_bytree" Parameter
Tune XGBoost "eta" Parameter
Tune XGBoost "gamma" Parameter
Tune XGBoost "grow_policy" Parameter
Tune XGBoost "learning_rate" Parameter
Tune XGBoost "max_bin" Parameter
Tune XGBoost "max_delta_step" Parameter
Tune XGBoost "max_depth" Parameter
Tune XGBoost "max_leaves" Parameter
Tune XGBoost "min_child_weight" Parameter
Tune XGBoost "min_split_loss" Parameter
Tune XGBoost "n_estimators" Parameter
Tune XGBoost "n_jobs" Parameter
Tune XGBoost "nthread" Parameter
Tune XGBoost "num_parallel_tree" Parameter
Tune XGBoost "reg_alpha" Parameter
Tune XGBoost "reg_lambda" Parameter
Tune XGBoost "subsample" Parameter
Tune XGBoost "tree_method" Parameter
XGBoost "best_iteration" Property
XGBoost "best_score" Property
XGBoost "evals_result()" Method
XGBoost "gbtree" vs "gblinear" booster
XGBoost "scale_pos_weight" Parameter Unused For Regression
XGBoost "scale_pos_weight" vs "sample_weight" for Imbalanced Classification
XGBoost Compare "alpha" vs "reg_alpha" Parameters
XGBoost Compare "gamma" vs "min_split_loss" Parameters
XGBoost Compare "iteration_range" vs "ntree_limit" Parameters
XGBoost Compare "lambda" vs "reg_lambda" Parameters
XGBoost Compare "learning_rate" vs "eta" Parameters
XGBoost Compare "max_cat_threshold" vs "max_cat_to_onehot" Parameters
XGBoost Compare "n_jobs" vs "nthread" Parameters
XGBoost Compare "num_boost_round" vs "n_estimators" Parameters
XGBoost Compare "seed" vs "random_state" Parameters
XGBoost Configure Multiple Metrics With "eval_metric" Parameter
XGBoost Configure "aft-nloglik" Eval Metric
XGBoost Configure "auc" Eval Metric
XGBoost Configure "aucpr" Eval Metric
XGBoost Configure "class_weight" Parameter for Imbalanced Classification
XGBoost Configure "cox-nloglik" Eval Metric
XGBoost Configure "error" Eval Metric
XGBoost Configure "error@t" Eval Metric
XGBoost Configure "gamma-deviance" Eval Metric
XGBoost Configure "gamma-nloglik" Eval Metric
XGBoost Configure "interval-regression-accuracy" Eval Metric
XGBoost Configure "logloss" Eval Metric
XGBoost Configure "mae" Eval Metric
XGBoost Configure "map" Eval Metric
XGBoost Configure "mape" Eval Metric
XGBoost Configure "max_delta_step" Parameter for Imbalanced Classification
XGBoost Configure "merror" Eval Metric
XGBoost Configure "mlogloss" Eval Metric
XGBoost Configure "mphe" Eval Metric
XGBoost Configure "ndcg" Eval Metric
XGBoost Configure "poisson-nloglik" Eval Metric
XGBoost Configure "pre" Eval Metric
XGBoost Configure "rmse" Eval Metric
XGBoost Configure "rmsle" Eval Metric
XGBoost Configure "sample_weight" Parameter for Imbalanced Classification
XGBoost Configure "scale_pos_weight" Parameter
XGBoost Configure "tweedie-nloglik" Eval Metric
XGBoost Configure xgboost.cv() Parameters
XGBoost Configure xgboost.train() Parameters
XGBoost Default "objective" Parameter For Learning Tasks
XGBoost Default Evaluation Metric "eval_metric" For Objectives
XGBoost Default Parameters
XGBoost get_booster()
XGBoost get_num_boosting_rounds() Method
XGBoost get_params() Method
XGBoost get_xgb_params() Method
XGBoost Linear Booster "coef_" Property
XGBoost Linear Booster "feature_selector" Parameter
XGBoost Linear Booster "intercept_" Property
XGBoost Linear Booster "top_k" Parameter
XGBoost Linear Booster "updater" Parameter
XGBoost Regularization Techniques
XGBoost Sensitivity Analysis