XGBoost Documentation
XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can scale to problems with billions of examples and beyond.
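For a first taste before diving into the guides below, here is a minimal sketch of training a boosted tree classifier with the Python package's scikit-learn estimator interface (covered under Python Package below); the synthetic dataset and hyperparameter values are illustrative placeholders, not recommendations.

```python
# A minimal sketch using the Python package's scikit-learn estimator
# interface; the synthetic dataset and hyperparameter values are
# placeholders, not tuned recommendations.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Build a toy binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a gradient boosted tree ensemble and evaluate held-out accuracy.
model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```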
Contents
- Installation Guide
- Building From Source
- Get Started with XGBoost
- XGBoost Tutorials
  - Introduction to Boosted Trees
  - Introduction to Model IO
  - Learning to Rank
  - DART booster
  - Monotonic Constraints
  - Feature Interaction Constraints
  - Survival Analysis with Accelerated Failure Time
  - Categorical Data
  - Multiple Outputs
  - Random Forests(TM) in XGBoost
  - Distributed XGBoost on Kubernetes
  - Distributed XGBoost with XGBoost4J-Spark
  - Distributed XGBoost with XGBoost4J-Spark-GPU
  - Distributed XGBoost with Dask
  - Distributed XGBoost with PySpark
  - Distributed XGBoost with Ray
  - Using XGBoost External Memory Version
  - C API Tutorial
  - Text Input Format of DMatrix
  - Notes on Parameter Tuning
  - Custom Objective and Evaluation Metric
  - Intercept
  - Privacy Preserving Inference with Concrete ML
- Frequently Asked Questions
- XGBoost User Forum
- GPU Support
- XGBoost Parameters
- Prediction
- Tree Methods
- Python Package
  - Python Package Introduction
  - Using the Scikit-Learn Estimator Interface
  - Python API Reference
  - Callback Functions
  - Model
  - XGBoost Python Feature Walkthrough
  - XGBoost Dask Feature Walkthrough
  - Survival Analysis Walkthrough
  - GPU Acceleration Demo
  - Using XGBoost with RAPIDS Memory Manager (RMM) plugin (EXPERIMENTAL)
- R Package
- JVM Package
- Ruby Package
- Swift Package
- Julia Package
- C Package
- C++ Interface
- CLI Interface
- Contribute to XGBoost