
Parallel

Helpful examples for parallelism with XGBoost models, covering multithreaded training and prediction, thread and process pools, and thread-count tuning. A few illustrative sketches of the recurring patterns follow the list.

Configure XGBoost "n_jobs" Parameter
Configure XGBoost "nthread" Parameter
Thread-Safe Predictions with XGBoost
Tune XGBoost "n_jobs" Parameter
Tune XGBoost "nthread" Parameter
Verify CPU Core Utilization During XGBoost Model Training
XGBoost Benchmark Model Training Time
XGBoost Compare "n_jobs" vs "nthread" Parameters
XGBoost Configure "n_jobs" for Grid Search
XGBoost Configure "n_jobs" for Random Search
XGBoost Configure "OMP_NUM_THREADS" for Inference
XGBoost Configure "OMP_NUM_THREADS" for Model Training
XGBoost Configure The Number of BLAS Threads
XGBoost Configure The Number of OpenMP Threads
XGBoost CPU Usage Below 100% During Training
XGBoost Model Training is Mostly Deterministic (Reproducibility)
XGBoost Multi-Core Training and Prediction
XGBoost Multiple CPUs for Training and Prediction
XGBoost Multithreaded Training and Prediction
XGBoost Parallel Prediction With a Process Pool (multiprocessing)
XGBoost Parallel Prediction With a Process Pool and Shared Memory
XGBoost Parallel Prediction With a Thread Pool (threading)
XGBoost Releases GIL During Inference (prediction)
XGBoost Releases GIL During Training
XGBoost Releases the Global Interpreter Lock (GIL)
XGBoost Report Execution Time
XGBoost Single-Threaded Training and Prediction (no threads)
XGBoost Train Multiple Models in Parallel (multiprocessing)
XGBoost Train Multiple Models in Parallel (threading)
XGBoost Train Multiple Models in Parallel with Joblib
XGBoost Training Time of Max Depth vs Boosting Rounds
XGBoost Training Time of Threads vs Boosting Rounds
XGBoost Training Time of Tree Method vs Boosting Rounds
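The most common starting point is the "n_jobs" parameter, which controls how many OpenMP threads XGBoost uses for training. Below is a minimal sketch; the synthetic dataset and the choice of -1 (use all available cores) are illustrative, not defaults recommended by these examples.

```python
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

# Small synthetic dataset for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# n_jobs sets the number of OpenMP threads used by XGBoost;
# -1 means "use all available CPU cores"
model = XGBClassifier(n_estimators=100, n_jobs=-1, random_state=42)
model.fit(X, y)
```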
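Because XGBoost releases the Python Global Interpreter Lock (GIL) during prediction, a plain thread pool can run predictions concurrently. This sketch splits the rows into batches and predicts each batch on its own thread; the batch count and pool size are arbitrary illustrative values.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=10000, n_features=20, random_state=42)

# Pin the model to one thread so the pool, not XGBoost, provides parallelism
model = XGBClassifier(n_estimators=50, n_jobs=1, random_state=42)
model.fit(X, y)

# Split rows into batches and predict each batch on a separate thread
batches = np.array_split(X, 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    preds = list(pool.map(model.predict, batches))

yhat = np.concatenate(preds)
print(yhat.shape)
```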
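For training several models at once (e.g., one per hyperparameter value), joblib can fan the fits out across workers. A minimal sketch, assuming a hypothetical fit_one() helper and an arbitrary max_depth grid; pinning each model to a single thread avoids oversubscribing the CPU when workers run in parallel.

```python
from joblib import Parallel, delayed
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)


def fit_one(max_depth):
    # One thread per model so the joblib workers do not compete for cores
    model = XGBClassifier(
        n_estimators=50, max_depth=max_depth, n_jobs=1, random_state=42
    )
    model.fit(X, y)
    return model


# Fit one model per max_depth value, four workers in parallel
models = Parallel(n_jobs=4)(delayed(fit_one)(d) for d in [2, 4, 6, 8])
print(len(models))
```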