XGBoost

Developer(s): The XGBoost Contributors
Initial release: March 27, 2014
Stable release: 2.1.1[1] / 30 July 2024
Repository: github.com/dmlc/xgboost
Written in: C++
Operating system: Linux, macOS, Microsoft Windows
Type: Machine learning
License: Apache License 2.0
Website: xgboost.ai

XGBoost[2] (eXtreme Gradient Boosting) is an open-source software library which provides a regularizing gradient boosting framework for C++, Java, Python,[3] R,[4] Julia,[5] Perl,[6] and Scala. It works on Linux, Microsoft Windows,[7] and macOS.[8] From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine, as well as the distributed processing frameworks Apache Hadoop, Apache Spark, Apache Flink, and Dask.[9][10]

XGBoost gained much popularity and attention in the mid-2010s as the algorithm of choice for many winning teams of machine learning competitions.[11]

History


XGBoost started as a research project by Tianqi Chen[12] as part of the Distributed (Deep) Machine Learning Community (DMLC) group. It began as a terminal application that could be configured using a libsvm configuration file, and it became well known in ML competition circles after its use in the winning solution of the Higgs Machine Learning Challenge. Soon afterwards, the Python and R packages were built, and XGBoost now has package implementations for Java, Scala, Julia, Perl, and other languages. This brought the library to more developers and contributed to its popularity in the Kaggle community, where it has been used in a large number of competitions.[11]

It was soon integrated with a number of other packages, making it easier to use in their respective communities. It has since been integrated with scikit-learn for Python users and with the caret package for R users. It can also be integrated into data-flow frameworks such as Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit[13] and XGBoost4J.[14] XGBoost is also available on OpenCL for FPGAs.[15] An efficient, scalable implementation of XGBoost has been published by Tianqi Chen and Carlos Guestrin.[16]
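
As an illustration of the scikit-learn integration mentioned above, the sketch below drops XGBoost's scikit-learn-compatible estimator into an ordinary cross-validation workflow; the dataset and hyperparameter values are arbitrary examples, not recommendations.

```python
# Illustrative sketch: XGBoost used through its scikit-learn-compatible API.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

model = XGBClassifier(
    n_estimators=200,   # number of boosting rounds (trees)
    max_depth=4,        # depth of each tree
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
)

# Because the estimator follows the scikit-learn API, it works with
# utilities such as cross_val_score, Pipeline, and GridSearchCV.
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())
```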

While the XGBoost model often achieves higher accuracy than a single decision tree, it sacrifices the intrinsic interpretability of decision trees. For example, following the path that a decision tree takes to make its decision is trivial and self-explanatory, but following the paths of hundreds or thousands of trees is much harder.

Features


Salient features of XGBoost that distinguish it from other gradient boosting algorithms include the following (several of these correspond directly to training parameters, as sketched after this list):[17][18][16]

  • Clever penalization of trees
  • A proportional shrinking of leaf nodes
  • Newton Boosting
  • Extra randomization parameter
  • Implementation on single machines and distributed systems, as well as out-of-core computation
  • Automatic feature selection [citation needed]
  • Theoretically justified weighted quantile sketching for efficient computation
  • Parallel tree structure boosting with sparsity
  • Efficient cacheable block structure for decision tree training
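
Several of these features surface as training parameters of the xgboost Python package. The following is a minimal, hedged sketch (the synthetic data and parameter values are arbitrary examples, not tuning recommendations):

```python
# Illustrative sketch: how some of the listed features map onto documented
# xgboost training parameters. Values are arbitrary examples.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

dtrain = xgb.DMatrix(X, label=y)   # sparsity-aware data container

params = {
    "objective": "binary:logistic",
    "eta": 0.1,               # learning rate: proportional shrinkage of leaf values
    "lambda": 1.0,            # L2 penalty on leaf weights (penalization of trees)
    "alpha": 0.0,             # L1 penalty on leaf weights
    "subsample": 0.8,         # row subsampling (extra randomization)
    "colsample_bytree": 0.8,  # column subsampling (extra randomization)
    "max_depth": 4,
    "tree_method": "hist",    # histogram (quantile-sketch) based split finding
}

booster = xgb.train(params, dtrain, num_boost_round=100)
preds = booster.predict(xgb.DMatrix(X))  # predicted probabilities
```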

The algorithm


XGBoost works as Newton–Raphson in function space, unlike gradient boosting, which works as gradient descent in function space; a second-order Taylor approximation of the loss function is used to make the connection to the Newton–Raphson method.

A generic unregularized XGBoost algorithm is:

Input: training set $\{(x_i, y_i)\}_{i=1}^N$, a differentiable loss function $L(y, F(x))$, a number of weak learners $M$, and a learning rate $\alpha$.

Algorithm:

  1. Initialize the model with a constant value:
     $\hat{f}_{(0)}(x) = \underset{\theta}{\arg\min} \sum_{i=1}^N L(y_i, \theta)$
  2. For $m = 1$ to $M$:
    1. Compute the 'gradients' and 'hessians':
       $\hat{g}_m(x_i) = \left[\frac{\partial L(y_i, f(x_i))}{\partial f(x_i)}\right]_{f(x)=\hat{f}_{(m-1)}(x)}$
       $\hat{h}_m(x_i) = \left[\frac{\partial^2 L(y_i, f(x_i))}{\partial f(x_i)^2}\right]_{f(x)=\hat{f}_{(m-1)}(x)}$
    2. Fit a base learner (or weak learner, e.g. a tree) using the training set $\left\{\left(x_i, -\frac{\hat{g}_m(x_i)}{\hat{h}_m(x_i)}\right)\right\}_{i=1}^N$ by solving the optimization problem below:
       $\hat{\phi}_m = \underset{\phi \in \mathbf{\Phi}}{\arg\min} \sum_{i=1}^N \frac{1}{2} \hat{h}_m(x_i) \left[-\frac{\hat{g}_m(x_i)}{\hat{h}_m(x_i)} - \phi(x_i)\right]^2$
       $\hat{f}_m(x) = \alpha \hat{\phi}_m(x)$
    3. Update the model:
       $\hat{f}_{(m)}(x) = \hat{f}_{(m-1)}(x) + \hat{f}_m(x)$
  3. Output $\hat{f}(x) = \hat{f}_{(M)}(x) = \sum_{m=0}^{M} \hat{f}_m(x)$
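
To make the pseudocode concrete, the following is a minimal, hedged Python sketch of this generic unregularized Newton-boosting loop, specialized to binary classification with log loss and using scikit-learn regression trees as base learners. It illustrates the algorithm above; it is not the actual XGBoost implementation, which adds regularized objectives, sparsity handling, and many systems-level optimizations.

```python
# Sketch of the generic unregularized Newton-boosting loop above (log loss).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def newton_boost(X, y, M=100, alpha=0.1, max_depth=3):
    # Step 1: initialize with a constant value (log-odds of the base rate).
    p0 = np.clip(y.mean(), 1e-6, 1 - 1e-6)
    f0 = np.log(p0 / (1 - p0))
    f = np.full(len(y), f0)
    trees = []
    for m in range(M):
        p = 1.0 / (1.0 + np.exp(-f))          # current predicted probabilities
        g = p - y                              # gradients of log loss w.r.t. f(x_i)
        h = np.clip(p * (1 - p), 1e-12, None)  # hessians of log loss w.r.t. f(x_i)
        # Steps 2.1-2.2: fit a base learner to the Newton step -g/h; using h as
        # sample weights reproduces the weighted least-squares problem above.
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, -g / h, sample_weight=h)
        # Step 2.3: update the model with the learning-rate-scaled step.
        f += alpha * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_proba(f0, trees, X, alpha=0.1):
    # Step 3: the output model is the constant plus the sum of scaled tree steps.
    f = np.full(X.shape[0], f0)
    for tree in trees:
        f += alpha * tree.predict(X)
    return 1.0 / (1.0 + np.exp(-f))
```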

Awards

  • John Chambers Award (2016)[19]
  • High Energy Physics meets Machine Learning award (HEP meets ML) (2016)[20]


References

  1. ^ "Release 2.1.1". 30 July 2024. Retrieved 27 August 2024.
  2. ^ "GitHub project webpage". GitHub. June 2022. Archived from the original on 2021-04-01. Retrieved 2016-04-05.
  3. ^ "Python Package Index PYPI: xgboost". Archived from the original on 2017-08-23. Retrieved 2016-08-01.
  4. ^ "CRAN package xgboost". Archived from the original on 2018-10-26. Retrieved 2016-08-01.
  5. ^ "Julia package listing xgboost". Archived from the original on 2016-08-18. Retrieved 2016-08-01.
  6. ^ "CPAN module AI::XGBoost". Archived from the original on 2020-03-28. Retrieved 2020-02-09.
  7. ^ "Installing XGBoost for Anaconda in Windows". IBM. Archived from the original on 2018-05-08. Retrieved 2016-08-01.
  8. ^ "Installing XGBoost on Mac OSX". IBM. Archived from the original on 2018-05-08. Retrieved 2016-08-01.
  9. ^ "Dask Homepage". Archived from the original on 2022-09-14. Retrieved 2021-07-15.
  10. ^ "Distributed XGBoost with Dask — xgboost 1.5.0-dev documentation". xgboost.readthedocs.io. Archived from the original on 2022-06-04. Retrieved 2021-07-15.
  11. ^ a b "XGBoost - ML winning solutions (incomplete list)". GitHub. Archived from the original on 2017-08-24. Retrieved 2016-08-01.
  12. ^ "Story and Lessons behind the evolution of XGBoost". Archived from the original on 2016-08-07. Retrieved 2016-08-01.
  13. ^ "Rabit - Reliable Allreduce and Broadcast Interface". GitHub. Archived from the original on 2018-06-11. Retrieved 2016-08-01.
  14. ^ "XGBoost4J". Archived from the original on 2018-05-08. Retrieved 2016-08-01.
  15. ^ "XGBoost on FPGAs". GitHub. Archived from the original on 2020-09-13. Retrieved 2019-08-01.
  16. ^ a b Chen, Tianqi; Guestrin, Carlos (2016). "XGBoost: A Scalable Tree Boosting System". In Krishnapuram, Balaji; Shah, Mohak; Smola, Alexander J.; Aggarwal, Charu C.; Shen, Dou; Rastogi, Rajeev (eds.). Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016. ACM. pp. 785–794. arXiv:1603.02754. doi:10.1145/2939672.2939785. ISBN 9781450342322. S2CID 4650265.
  17. ^ Gandhi, Rohith (2019-05-24). "Gradient Boosting and XGBoost". Medium. Archived from the original on 2020-03-28. Retrieved 2020-01-04.
  18. ^ "Tree Boosting With XGBoost – Why Does XGBoost Win "Every" Machine Learning Competition?". Synced. 2017-10-22. Archived from the original on 2020-03-28. Retrieved 2020-01-04.
  19. ^ "John Chambers Award Previous Winners". Archived from the original on 2017-07-31. Retrieved 2016-08-01.
  20. ^ "HEP meets ML Award". Archived from the original on 2018-05-08. Retrieved 2016-08-01.