
XGBoost Announcement

The XGBoost project was announced on Kaggle.com as part of the “Higgs Boson Machine Learning Challenge” in 2014.

Kaggle is an online platform that hosts data science competitions, provides datasets, and offers a community for data scientists and machine learning practitioners to collaborate and share knowledge.

The official first release of XGBoost (v0.1) was on March 27, 2014.

The Higgs Boson Machine Learning Challenge started about one and a half months later, on May 13, 2014.
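Submissions to the challenge were ranked by the Approximate Median Significance (AMS), the metric referenced in the announcement below. As a rough sketch, the challenge's published formula (with a constant regularization term b_reg = 10) can be computed like this; the example numbers are purely illustrative:

```python
import math

def ams(s, b, b_reg=10.0):
    """Approximate Median Significance, the ranking metric of the
    Higgs Boson Machine Learning Challenge.

    s: expected signal events selected by the classifier
    b: expected background events selected by the classifier
    b_reg: constant regularization term (10 in the challenge)
    """
    return math.sqrt(2.0 * ((s + b + b_reg) * math.log(1.0 + s / (b + b_reg)) - s))

# Illustrative numbers only: 450 selected signal and 15,000 background events
print(ams(450.0, 15_000.0))  # about 3.66
```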

The announcement for XGBoost was made by Bing Xu in May 2014 in a forum post titled “Public Starting Guide to Get above 3.60 AMS score”.

Hi all,

Tianqi Chen (crowwork) has made a fast and friendly boosted tree library, XGBoost. By using XGBoost and running a script, you can train a model with a 3.60 AMS score in about 42 seconds.

The demo is at https://github.com/tqchen/xgboost/tree/master/demo/kaggle-higgs; you can just type ./run.sh to get the score after you build it.

XGBoost is as easy to use as scikit-learn. On my computer with a Core i5-4670K CPU, the speed test.py (boosting 10 trees) shows:

sklearn.GBM costs: 77.5 seconds
XGBoost with 1 thread costs: 11.0 seconds
XGBoost with 2 threads costs: 5.85 seconds
XGBoost with 4 threads costs: 3.40 seconds
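A comparable benchmark is straightforward to reproduce today. The sketch below is not the original speed test script; it uses synthetic data and current APIs to time scikit-learn's GradientBoostingClassifier against XGBoost at different thread counts:

```python
import time

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the Higgs training data
X, y = make_classification(n_samples=100_000, n_features=30, random_state=42)

# scikit-learn's gradient boosting, 10 trees
start = time.time()
GradientBoostingClassifier(n_estimators=10).fit(X, y)
print(f"sklearn GBM: {time.time() - start:.2f} s")

# XGBoost, 10 boosting rounds, varying thread counts
dtrain = xgb.DMatrix(X, label=y)
for n_threads in (1, 2, 4):
    start = time.time()
    xgb.train({"objective": "binary:logistic", "nthread": n_threads},
              dtrain, num_boost_round=10)
    print(f"XGBoost, {n_threads} thread(s): {time.time() - start:.2f} s")
```

Absolute timings will of course differ on modern hardware, but the scaling with nthread is still visible.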

As in competitions held before, publicly sharing methods will boost the performance of all teams and reduce barriers for new learners. We hope all of us can learn and enjoy more during the competition.

BTW, don’t forget to star XGBoost ;)

Update:

May 20th, 2014: If you are using XGBoost 0.2, please pull the newest version. Binary classification will run incorrectly if scale_pos_weight is not set; the new version fixes this problem. We are sorry for the mistake, and please update.
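For context, scale_pos_weight re-weights the gradients of the positive class and remains XGBoost's standard control for imbalanced binary classification. Here is a minimal sketch of the common heuristic of setting it to the negative-to-positive ratio (synthetic data; variable names are illustrative):

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification

# Imbalanced binary data: roughly 10% positive examples
X, y = make_classification(n_samples=10_000, weights=[0.9], random_state=7)

# Common heuristic: ratio of negative to positive examples
ratio = float(np.sum(y == 0)) / np.sum(y == 1)

params = {
    "objective": "binary:logistic",
    "scale_pos_weight": ratio,  # up-weights positive-class gradients
}
booster = xgb.train(params, xgb.DMatrix(X, label=y), num_boost_round=10)
```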

XGBoost became an important part of the competition, with lead author Tianqi Chen fielding many questions about XGBoost in the forums and rapidly developing the project.

These discussions remain an excellent source of information about XGBoost.

At the completion of the challenge, the lead authors of XGBoost (Tianqi Chen and Tong He) were singled out for a special award in a forum post titled “Winner announcement”.

In addition, documented software was scrutinized, and the special HEP meets ML award is given to:

crowwork (Tianqi Chen and Tong He)

They have developed XGBoost (https://github.com/tqchen/xgboost) and made it available to other participants early, and it was indeed used by many of them; while not giving the very best score, it appears to be an excellent compromise between performance and simplicity, which makes it a promising improvement to the tools currently used by high energy physicists.

The team will be invited to CERN in 2015 for a workshop (being organised) where the application of machine learning techniques to high energy physics, in particular as they emerged in this challenge, will be discussed further.


