
XGBoostLSS - An extension of XGBoost to probabilistic forecasting

We propose a new framework of XGBoost that predicts the entire conditional distribution of univariate and multivariate responses. In particular, XGBoostLSS models all parameters of a parametric distribution, i.e., location, scale and shape (LSS), instead of the conditional mean only. Choosing from a wide range of continuous, discrete, and mixed discrete-continuous distributions, modelling and predicting the entire conditional distribution greatly enhances the flexibility of XGBoost: it yields probabilistic forecasts from which prediction intervals and quantiles of interest can be derived.
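To make this concrete, suppose a model predicts a Gaussian mean and standard deviation for each observation; any quantile or prediction interval then follows directly from the predicted parameters. The following is an illustrative sketch only (mu_hat and sigma_hat are hypothetical placeholders, not XGBoostLSS output objects):

# Illustration of deriving quantiles from predicted Gaussian parameters.
# mu_hat and sigma_hat are hypothetical per-observation predictions.
import numpy as np
from scipy.stats import norm

mu_hat = np.array([10.0, 12.5])    # predicted means
sigma_hat = np.array([1.0, 2.0])   # predicted standard deviations

# 90% prediction interval from the 5th and 95th percentiles
lower = norm.ppf(0.05, loc=mu_hat, scale=sigma_hat)
upper = norm.ppf(0.95, loc=mu_hat, scale=sigma_hat)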

Features

✅ Estimation of all distributional parameters.
✅ Multi-target regression allows modelling of multivariate responses and their dependencies.
✅ Automatic derivation of Gradients and Hessians of all distributional parameters using PyTorch.
✅ Automated hyper-parameter search, including pruning, is done via Optuna.
✅ The output of XGBoostLSS is explained using SHapley Additive exPlanations.
✅ XGBoostLSS provides full compatibility with all the features and functionality of XGBoost.
✅ XGBoostLSS is available in Python.

News

💥 [2023-05-26] Release of v0.2.1. See the release notes for an overview.
💥 [2023-05-18] Release of v0.2.0. See the release notes for an overview.
💥 [2022-10-14] XGBoostLSS now supports multi-target regression. (Currently available via Py-BoostLSS).
💥 [2022-01-03] XGBoostLSS now supports estimation of the Gamma distribution.
💥 [2021-12-22] XGBoostLSS now supports estimating the full predictive distribution via Expectile Regression.
💥 [2021-12-20] XGBoostLSS is initialized with suitable starting values to improve convergence of estimation.
💥 [2021-12-04] XGBoostLSS now supports automatic derivation of Gradients and Hessians.
💥 [2021-12-02] XGBoostLSS now supports pruning during hyperparameter optimization.
💥 [2021-11-14] XGBoostLSS v0.1.0 is released!

Installation

To install XGBoostLSS, please first run

pip install git+https://github.com/StatMixedML/XGBoostLSS.git

Then, to install the shap dependency, run

pip install git+https://github.com/dsgibbons/shap.git

How to use

We refer to the example section for example notebooks.
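For orientation, here is a minimal sketch of a typical workflow. It is modelled on the example notebooks; the exact module paths and predict arguments are assumptions that may differ between versions, so please treat the notebooks as authoritative.

# Hedged sketch of a typical workflow; module paths and argument names
# (e.g., pred_type, quantiles) are assumed from the example notebooks
# and may differ between versions.
import xgboost as xgb
from xgboostlss.model import XGBoostLSS
from xgboostlss.distributions.Gaussian import Gaussian

dtrain = xgb.DMatrix(X_train, label=y_train)   # X_train, y_train: your data
dtest = xgb.DMatrix(X_test)

xgblss = XGBoostLSS(Gaussian())                # choose a response distribution
xgblss.train({"eta": 0.1, "max_depth": 3}, dtrain, num_boost_round=100)

# Predict distributional parameters, or quantiles derived from them
params = xgblss.predict(dtest, pred_type="parameters")
quants = xgblss.predict(dtest, pred_type="quantiles", quantiles=[0.05, 0.95])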

Available Distributions

XGBoostLSS currently supports the following PyTorch distributions.

| Distribution      | Usage              | Type                        | Support                     | Number of Parameters |
|-------------------|--------------------|-----------------------------|-----------------------------|----------------------|
| Beta              | Beta()             | Continuous (Univariate)     | $y \in (0, 1)$              | 2                    |
| Cauchy            | Cauchy()           | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | 2                    |
| Expectile         | Expectile()        | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | Number of expectiles |
| Gamma             | Gamma()            | Continuous (Univariate)     | $y \in (0, \infty)$         | 2                    |
| Gaussian          | Gaussian()         | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | 2                    |
| Gumbel            | Gumbel()           | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | 2                    |
| Laplace           | Laplace()          | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | 2                    |
| LogNormal         | LogNormal()        | Continuous (Univariate)     | $y \in (0, \infty)$         | 2                    |
| Negative Binomial | NegativeBinomial() | Discrete Count (Univariate) | $y \in \{0, 1, 2, 3, ...\}$ | 2                    |
| Poisson           | Poisson()          | Discrete Count (Univariate) | $y \in \{0, 1, 2, 3, ...\}$ | 1                    |
| Student-T         | StudentT()         | Continuous (Univariate)     | $y \in (-\infty,\infty)$    | 3                    |
| Weibull           | Weibull()          | Continuous (Univariate)     | $y \in [0, \infty)$         | 2                    |
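The Support column is the main selection criterion: the chosen distribution should place probability only where the response can actually occur. As a hedged sketch (assuming the module layout mirrors the Usage column above):

# Sketch: pick a distribution whose support matches the response.
# Module paths are assumed to mirror the Usage column above.
from xgboostlss.model import XGBoostLSS
from xgboostlss.distributions.Gamma import Gamma
from xgboostlss.distributions.NegativeBinomial import NegativeBinomial

model_positive = XGBoostLSS(Gamma())            # y in (0, inf), e.g., durations
model_counts = XGBoostLSS(NegativeBinomial())   # y in {0, 1, 2, ...}, e.g., counts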

Some Notes

Stabilization

Since XGBoostLSS updates the parameter estimates using Gradients and Hessians, it is important that these are comparable in magnitude across all distributional parameters. If their ranges differ widely, the estimation of Gradients and Hessians can become unstable, so that XGBoostLSS may fail to converge or converge very slowly. To mitigate this, XGBoostLSS implements a stabilization of Gradients and Hessians.
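The stabilization is handled internally. Purely as an illustration of the underlying idea (not of XGBoostLSS internals), one common scheme rescales each parameter's gradients by a robust measure of their magnitude so that all parameters yield updates of comparable size:

# Illustration of the stabilization idea, not XGBoostLSS internals:
# rescale each parameter's gradient vector by its median absolute value
# so that parameters on very different scales update comparably.
import numpy as np

def stabilize(grad, eps=1e-8):
    scale = np.median(np.abs(grad)) + eps
    return grad / scale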

For improved convergence, an alternative approach is to standardize the (continuous) response variable, e.g., by dividing it by a constant such as 100 (y/100). This proves especially valuable when the range of the response differs significantly from that of the Gradients and Hessians. In either case, both the built-in stabilization and response standardization should be evaluated against the specific dataset at hand.
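As a sketch of this standardization (the factor 100 is arbitrary and problem-specific; y_train and the quantile predictions are placeholders):

# Sketch: scale the response before training, then undo the scaling
# on the predictions. The factor 100 is arbitrary and problem-specific.
scale = 100.0
y_train_scaled = y_train / scale       # y_train: your response vector
# ... train on y_train_scaled ...
# back-transform predicted quantiles to the original scale:
# quants_original = quants_scaled * scale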

Runtime

Since XGBoostLSS updates all distributional parameters simultaneously, it requires training [number of iterations] * [number of distributional parameters] trees; for example, fitting a two-parameter Gaussian for 100 iterations trains 200 trees. Hence, the runtime of XGBoostLSS is generally slightly higher than that of XGBoost, which requires training [number of iterations] trees only.

Feedback

We encourage you to provide feedback on how to enhance XGBoostLSS or request the implementation of additional distributions by opening a new issue.

Reference Papers

März, A. (2022). Multi-Target XGBoostLSS Regression.
März, A. and Kneib, T. (2022). Distributional Gradient Boosting Machines.
März, A. (2019). XGBoostLSS - An extension of XGBoost to probabilistic forecasting.
