
XGBoost Paper

The XGBoost paper, titled “XGBoost: A Scalable Tree Boosting System,” is authored by Tianqi Chen and Carlos Guestrin.

It was published in 2016 and has since become a foundational paper in the field of machine learning, particularly in the area of ensemble learning using boosted trees.

XGBoost Paper Details

The paper was presented at KDD ’16: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining in August 2016.

The abstract reads:



Tree boosting is a highly effective and widely used machine learning method. In this paper, we describe a scalable end-to-end tree boosting system called XGBoost, which is used widely by data scientists to achieve state-of-the-art results on many machine learning challenges. We propose a novel sparsity-aware algorithm for sparse data and weighted quantile sketch for approximate tree learning. More importantly, we provide insights on cache access patterns, data compression and sharding to build a scalable tree boosting system. By combining these insights, XGBoost scales beyond billions of examples using far fewer resources than existing systems.


The paper introduces XGBoost, a scalable end-to-end tree boosting system that is widely used in data science competitions and practical applications. The authors focus on the system’s ability to scale from a single machine to distributed settings and to handle sparse data and memory-limited environments efficiently.

Key Challenges Addressed

The paper outlines key challenges in optimizing tree boosting methods, including handling sparse data arising from different sources such as missing values, frequent zero entries, and artifacts of feature engineering like one-hot encoding.
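The paper’s answer to sparse inputs is a sparsity-aware split finding algorithm: rows with a missing feature value are routed in a learned “default direction”, chosen by evaluating both options. The following is a minimal pure-Python sketch of that idea, not the paper’s optimized implementation; the quadratic enumeration and the toy parameter defaults are simplifications for illustration.

```python
def split_gain(GL, HL, GR, HR, lam=1.0, gamma=0.0):
    # Split gain from the paper's regularized objective: improvement in
    # structure score when one node (GL+GR, HL+HR) is split in two.
    def score(G, H):
        return G * G / (H + lam)
    return 0.5 * (score(GL, HL) + score(GR, HR) - score(GL + GR, HL + HR)) - gamma

def best_split_sparse(values, grads, hess, g_total, h_total, lam=1.0):
    # values/grads/hess cover only rows where the feature is present;
    # g_total/h_total also include rows where the feature is missing.
    g_missing = g_total - sum(grads)
    h_missing = h_total - sum(hess)
    best = {"gain": float("-inf"), "threshold": None, "default": None}
    for t in sorted(set(values))[:-1]:
        GL = sum(g for v, g in zip(values, grads) if v <= t)
        HL = sum(h for v, h in zip(values, hess) if v <= t)
        GR, HR = sum(grads) - GL, sum(hess) - HL
        # Try routing missing rows right, then left; keep the better gain.
        for default, gl, hl, gr, hr in (
            ("right", GL, HL, GR + g_missing, HR + h_missing),
            ("left", GL + g_missing, HL + h_missing, GR, HR),
        ):
            gain = split_gain(gl, hl, gr, hr, lam)
            if gain > best["gain"]:
                best = {"gain": gain, "threshold": t, "default": default}
    return best
```

The production system achieves the same result with two linear scans over presorted, present-only entries, which is what makes its complexity proportional to the number of non-missing entries rather than the total number of rows.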

System Design

The design of XGBoost is detailed with a focus on:

- a regularized learning objective that penalizes model complexity (number of leaves and magnitude of leaf weights) to reduce overfitting;
- gradient tree boosting based on a second-order Taylor approximation of the loss;
- shrinkage and column subsampling as additional safeguards against overfitting;
- split-finding algorithms, from the exact greedy algorithm to an approximate algorithm based on candidate quantiles.
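The regularized objective at the heart of the system’s design can be written, following the paper’s notation, as:

```latex
\mathcal{L}(\phi) = \sum_i l(\hat{y}_i, y_i) + \sum_k \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^2
```

where $l$ is a differentiable convex loss between the prediction $\hat{y}_i$ and the target $y_i$, each $f_k$ is a regression tree, $T$ is the number of leaves, and $w$ are the leaf weights. The $\gamma$ and $\lambda$ terms are what distinguish this objective from unregularized gradient tree boosting.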

Scalability and Performance

XGBoost introduces several innovations to ensure scalability and performance:

- a column block structure that stores data presorted by feature value, enabling parallel split finding;
- cache-aware access patterns with prefetching to reduce cache misses during gradient statistics accumulation;
- out-of-core computation using block compression and sharding across disks, so datasets larger than memory can still be processed.

System Features

The paper discusses XGBoost’s features that make it versatile and powerful:

- sparsity-aware split finding with a learned default direction for missing values;
- the weighted quantile sketch, which proposes candidate split points on weighted data with provable error bounds;
- support for arbitrary twice-differentiable loss functions through its second-order formulation;
- parallel, distributed, and out-of-core execution on top of a single code base.
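The weighted quantile sketch underlies the approximate split-finding algorithm: candidate thresholds are chosen so that each bucket between consecutive candidates carries roughly the same total second-order gradient (Hessian) weight. Below is an illustrative, exact, non-streaming sketch of that weighting idea in pure Python; the paper’s actual data structure additionally supports merge and prune operations with guaranteed approximation error, which this simplification omits.

```python
def weighted_quantile_candidates(values, hessians, eps=0.25):
    # Propose candidate split points such that the Hessian weight
    # between consecutive candidates is at most roughly eps * total.
    pairs = sorted(zip(values, hessians))          # sort by feature value
    total = sum(h for _, h in pairs)               # total Hessian weight
    candidates, acc = [], 0.0
    for v, h in pairs:
        acc += h                                   # weight since last candidate
        if acc >= eps * total:
            candidates.append(v)                   # emit a candidate threshold
            acc = 0.0
    return candidates
```

With uniform Hessians this reduces to ordinary quantiles; with skewed Hessians the candidates concentrate where the weight is, which is exactly why plain quantile sketches are insufficient for second-order boosting.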

Experimental Results

The authors present extensive experimental results to demonstrate XGBoost’s effectiveness and superiority in speed and performance compared to other well-known implementations of gradient boosting. The experiments covered various datasets and tasks, highlighting the robustness of XGBoost across different domains.

Use Cases

The paper concludes with discussions on real-world use cases where XGBoost has been effectively employed, including several Kaggle competition wins. These examples underscore the practical impact and relevance of the system in the broader machine learning community.

The XGBoost paper is highly technical yet accessible, and it successfully communicates the innovations and advantages of the XGBoost system, making it a critical reference for researchers and practitioners in machine learning.

See Also