
One-Hot Encode Categorical Features for XGBoost

XGBoost, like many machine learning algorithms, cannot directly handle string-valued categorical features (recent XGBoost versions offer native categorical support, but encoding categories numerically remains the most common approach).

One-hot encoding converts each categorical feature into a set of binary indicator columns, one per category, giving the algorithm a purely numeric representation it can work with directly.

This example demonstrates how to perform one-hot encoding on categorical features using scikit-learn’s OneHotEncoder before training an XGBoost model.

import pandas as pd
from sklearn.preprocessing import OneHotEncoder
from sklearn.compose import ColumnTransformer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Create example dataset
data = {'Color': ['Red', 'Blue', 'Green', 'Red', 'Blue'],
        'Size': ['Small', 'Medium', 'Large', 'Medium', 'Small'],
        'Label': [0, 1, 0, 1, 0]}
df = pd.DataFrame(data)

# Define categorical features
categorical_features = ['Color', 'Size']

# Create ColumnTransformer for one-hot encoding
transformer = ColumnTransformer(
    transformers=[('encoder', OneHotEncoder(), categorical_features)],
    remainder='passthrough')

# Perform one-hot encoding
X = transformer.fit_transform(df.drop('Label', axis=1))

# Extract target variable
y = df['Label']

# Initialize and train XGBoost model
model = XGBClassifier(random_state=42)
model.fit(X, y)

# Make predictions on new data
new_data = pd.DataFrame({'Color': ['Green'], 'Size': ['Medium']})
new_data_encoded = transformer.transform(new_data)
prediction = model.predict(new_data_encoded)

Here’s a step-by-step breakdown:

  1. We create a small example dataset df with categorical features ‘Color’ and ‘Size’.
  2. We define the categorical features in the categorical_features list.
  3. We create a ColumnTransformer object with OneHotEncoder to apply one-hot encoding to the categorical features. The remainder='passthrough' argument ensures that any non-specified columns are left unchanged.
  4. We use the fit_transform() method of the ColumnTransformer to perform one-hot encoding on the feature columns (excluding the ‘Label’ column) and store the result in X.
  5. We extract the target variable y from the ‘Label’ column.
  6. We initialize an XGBoost classifier and train it on the one-hot encoded data.
  7. To make predictions on new data, we first use the transform() method of the ColumnTransformer to one-hot encode the new data, ensuring it has the same columns as the training data, then pass it to the trained model.

By using scikit-learn’s OneHotEncoder and ColumnTransformer, we can efficiently perform one-hot encoding on categorical features before training an XGBoost model, allowing the algorithm to effectively utilize the information contained in these features.
