
Objective

Helpful examples for configuring the learning objective when training XGBoost models.

The XGBoost objective (or learning task) specifies the goal of the model, such as regression, classification, or ranking. It does so by defining the loss function that the algorithm minimizes during training to optimize the model's predictive performance.

Examples
Configure XGBoost "binary:hinge" Objective
Configure XGBoost "binary:logistic" Objective
Configure XGBoost "binary:logitraw" Objective
Configure XGBoost "count:poisson" Objective
Configure XGBoost "multi:softmax" Objective
Configure XGBoost "multi:softprob" Objective
Configure XGBoost "objective" Parameter
Configure XGBoost "rank:map" Objective
Configure XGBoost "rank:ndcg" Objective
Configure XGBoost "rank:pairwise" Objective
Configure XGBoost "reg:absoluteerror" Objective (mean absolute error)
Configure XGBoost "reg:gamma" Objective
Configure XGBoost "reg:linear" Objective
Configure XGBoost "reg:logistic" Objective
Configure XGBoost "reg:pseudohubererror" Objective
Configure XGBoost "reg:quantileerror" Objective
Configure XGBoost "reg:squarederror" Objective
Configure XGBoost "reg:squaredlogerror" Objective
Configure XGBoost "reg:tweedie" Objective
Configure XGBoost "survival:aft" Objective
Configure XGBoost "survival:cox" Objective
Configure XGBoost Objective "binary:logistic" vs "binary:logitraw"
Configure XGBoost Objective "multi:softmax" vs "multi:softprob"
Configure XGBoost Objective "reg:logistic" vs "binary:logistic"
Configure XGBoost Objective "survival:cox" vs "survival:aft"
XGBoost Default "objective" Parameter For Learning Tasks
XGBoost Train Model With Custom Objective Function