Welcome to PGBM’s documentation!
Probabilistic Gradient Boosting Machines (PGBM) is a probabilistic gradient boosting framework in Python, built on PyTorch / scikit-learn and developed by AIRLab Amsterdam. It provides the following advantages over existing frameworks:
Probabilistic regression estimates instead of only point estimates.
Auto-differentiation of custom loss functions.
Native GPU acceleration.
Distributed training for CPU and GPU, across multiple nodes.
Ability to optimize probabilistic estimates after training for a set of common distributions, without retraining the model.
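To illustrate the first and last points above, here is a minimal NumPy sketch of the general idea behind probabilistic estimates: the model produces a mean and a variance per prediction, and a predictive distribution is obtained by sampling from an assumed distribution (here a normal). All names and numbers are illustrative, not PGBM's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative per-example predicted means and standard deviations,
# standing in for what a trained probabilistic model would output.
mu = np.array([10.0, 12.5, 9.0])
sigma = np.array([1.0, 2.0, 0.5])

# Draw samples per example to form an empirical predictive distribution.
n_samples = 10_000
samples = rng.normal(mu, sigma, size=(n_samples, mu.size))

# The same samples yield both a point estimate and prediction intervals.
point = samples.mean(axis=0)
lower, upper = np.quantile(samples, [0.05, 0.95], axis=0)
```

Because the sampling distribution is chosen at prediction time, a different distribution (e.g. Student's t or log-normal) can be swapped in afterwards, which is what "without retraining the model" refers to.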
It is aimed at users interested in solving large-scale tabular probabilistic regression problems, such as probabilistic time series forecasting. For more details, read our paper or check out the examples in our GitHub repository.
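The custom loss functions mentioned above are consumed by gradient boosting frameworks in gradient/hessian form. The sketch below writes out that form analytically for a squared-error loss; it is a generic illustration of the convention, not PGBM's API (PGBM can derive these terms automatically via auto-differentiation).

```python
import numpy as np

def mse_objective(yhat, y):
    """Per-example gradient and hessian of the loss 0.5 * (yhat - y)**2.

    Gradient boosting fits each new tree to these two quantities:
    gradient = d loss / d yhat, hessian = d^2 loss / d yhat^2.
    """
    gradient = yhat - y            # first derivative w.r.t. the prediction
    hessian = np.ones_like(yhat)   # second derivative is constant for MSE
    return gradient, hessian
```

With auto-differentiation, only the scalar loss itself needs to be written; the derivatives above are obtained automatically, which is what makes arbitrary differentiable losses convenient to plug in.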