XGBoost is a powerful tool for regression tasks, even with just a single input feature (univariate regression).
Here’s a quick example of how to train an XGBoost model for univariate regression using the scikit-learn API:
# XGBoosting.com
# Fit an XGBoost Model for Univariate Regression using scikit-learn API
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor
# Generate a synthetic dataset with 1 feature
X, y = make_regression(n_samples=100, n_features=1, noise=0.1, random_state=42)
# Split data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Initialize XGBRegressor
model = XGBRegressor(objective='reg:squarederror', random_state=42)
# Fit the model to training data
model.fit(X_train, y_train)
# Make predictions with the fitted model
predictions = model.predict(X_test)
print(predictions[:5])
The key steps:

1. Generate or load your data. Here we create a synthetic dataset using make_regression from scikit-learn with just 1 feature.
2. Initialize an XGBRegressor with the appropriate objective (here, 'reg:squarederror' for regression).
3. Fit the model to your training data using fit().
4. Make predictions on new data using predict(), as shown in the sketch after this list.
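Once fitted, predict() accepts any 2-D array with a single column, so you can score the model on the held-out test set or feed it a brand-new value. The snippet below is a minimal sketch along those lines; the mean_squared_error metric and the input value 0.5 are illustrative choices, not part of the recipe above.

# Evaluate the fitted model and predict on a new single-feature input (illustrative sketch)
import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor
# Recreate the same synthetic univariate dataset and model as above
X, y = make_regression(n_samples=100, n_features=1, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = XGBRegressor(objective='reg:squarederror', random_state=42)
model.fit(X_train, y_train)
# Score the held-out test set (metric choice is an assumption, not from the recipe above)
mse = mean_squared_error(y_test, model.predict(X_test))
print(f"Test MSE: {mse:.3f}")
# Predict for a new input; note the 2-D shape (n_samples, n_features)
new_X = np.array([[0.5]])  # hypothetical input value, for illustration only
print(model.predict(new_X))

Because the model was trained on a single feature, any new input must keep that one-column shape; passing a flat 1-D array would raise a shape error.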