
XGBoost for Regression

XGBoost is a powerful tool for regression tasks.

Regression involves predicting continuous output values. XGBoost supports a range of regression tasks through its choice of objective (loss) function, such as squared error ('reg:squarederror') for standard least-squares regression.
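For example, a minimal sketch of how different objective strings select different losses (this assumes a recent XGBoost release; 'reg:absoluteerror' requires version 1.7 or newer):

# Squared error (L2) loss -- the default regression objective
l2_model = XGBRegressor(objective='reg:squarederror')

# Absolute error (L1) loss -- more robust to outliers (XGBoost >= 1.7)
l1_model = XGBRegressor(objective='reg:absoluteerror')

# Tweedie loss -- useful for non-negative, right-skewed targets
tweedie_model = XGBRegressor(objective='reg:tweedie')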

Here’s a quick guide on how to fit an XGBoost model for regression using the scikit-learn API.

# xgboosting.com
# Fit an XGBoost Model for Regression using scikit-learn API
from sklearn.datasets import make_regression
from xgboost import XGBRegressor

# Generate a synthetic dataset with 5 features
X, y = make_regression(n_samples=1000, n_features=5, noise=0.1, random_state=42)

# Initialize XGBRegressor
model = XGBRegressor(objective='reg:squarederror', random_state=42)

# Fit the model to training data
model.fit(X, y)

# Make predictions with the fitted model
predictions = model.predict(X)
print(predictions[:5])

In just a few lines of code, you can have a working XGBoost model for regression:

  1. Initialize an XGBRegressor with a regression objective (here, 'reg:squarederror', the squared-error loss).
  2. Fit the model to your training data using fit().
  3. Generate predictions using predict().
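
In practice, you would typically evaluate the model on held-out data rather than on the data it was trained on. A minimal sketch, assuming scikit-learn is available for the split and the metric:

# xgboosting.com
# Evaluate an XGBoost regression model on a held-out test set
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor

# Generate data and hold out 20% as a test set
X, y = make_regression(n_samples=1000, n_features=5, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit on the training split only
model = XGBRegressor(objective='reg:squarederror', random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out test split
mse = mean_squared_error(y_test, model.predict(X_test))
print(f"Test MSE: {mse:.3f}")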


See Also