
XGBoost for Binary Classification

XGBoost can be used for binary classification tasks.

Binary classification involves predicting one of two classes. The model's raw output is typically passed through a logistic function to return a probability for the positive class.
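
Under the hood, the 'binary:logistic' objective used below passes the model's raw margin score through the logistic (sigmoid) function to turn it into a probability between 0 and 1. Here is a minimal sketch of that transform (the 0.5 threshold shown in the comments is the conventional default, not something specific to XGBoost):

# Sketch of the logistic (sigmoid) transform behind 'binary:logistic'
import numpy as np

def sigmoid(margin):
    # Map a raw margin score to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-margin))

print(sigmoid(0.0))   # 0.5: right on the decision boundary
print(sigmoid(2.0))   # ~0.88: predicted class 1 at the usual 0.5 threshold
print(sigmoid(-2.0))  # ~0.12: predicted class 0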

Here’s a quick example of how to fit an XGBoost model for binary classification using the scikit-learn API.

# XGBoosting.com
# Fit an XGBoost Model for Binary Classification using scikit-learn API
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

# Generate a synthetic dataset with 2 classes
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Initialize XGBClassifier
model = XGBClassifier(objective='binary:logistic', random_state=42)

# Fit the model to training data
model.fit(X, y)

# Make predictions with the fit model
predictions = model.predict(X)
print(predictions[:5])

In just a few lines of code, you can have a working XGBoost model:

  1. Initialize an XGBClassifier with the appropriate objective (here, 'binary:logistic' for binary classification).
  2. Fit the model to your training data using fit().
  3. Make predictions with your model by calling predict().
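
Once this basic workflow is in place, a natural next step is to hold out a test set and work with predicted probabilities directly. The sketch below extends the example above using scikit-learn's train_test_split, predict_proba() and accuracy_score; the 80/20 split and the accuracy metric are illustrative choices, not requirements:

# Extended sketch: hold-out evaluation and class probabilities
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Same synthetic binary dataset as above
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, random_state=42)

# Hold out 20% of the rows for evaluation
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit on the training portion only
model = XGBClassifier(objective='binary:logistic', random_state=42)
model.fit(X_train, y_train)

# predict_proba() returns one column per class; column 1 holds the probability of class 1
probabilities = model.predict_proba(X_test)[:, 1]
print(probabilities[:5])

# Hard class labels and a simple accuracy check on the held-out data
predictions = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, predictions):.3f}")

Evaluating on held-out data gives a more realistic picture of performance than predicting on the training data, which the quick example above does only for brevity.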

You can learn more about how to use the XGBClassifier in the other examples on this site.