- Add `load_ngboost_model` compatibility loader for models saved with scikit-learn < 1.3 and loaded under newer scikit-learn versions (issue #389)
- Backfill missing `go_to_left` during load and restore trees to standard sklearn `Tree` instances after deserialization
- Add targeted regression tests for old-style tree-node pickles and helper-level tests for `Y_from_censored` and compatibility loading paths in `tests/test_helpers.py`
- Add SymPy-powered distribution factory support for defining LogScore and NGBoost distributions from symbolic expressions or `sympy.stats` distributions
- Add new built-in distributions via the factory, including Beta, Beta-Bernoulli, Beta-Binomial, and Logit-Normal
- Add support for Python 3.14 and update CI matrix to test Python 3.14
- Bump development version to `0.5.9dev`
- Fix natural gradient compatibility with NumPy 2
- Support for Python 3.13
- Update dev dependencies
- Add support for Weibull and HalfNormal Distributions
- Upgrade sklearn to > 1.6
- Update `partial_fit` to respect validation data
- Allow NaN as input
- Poetry update
- Support for Numpy 2.0
- Value error fix
- Linting updates
- Adds support for NormalFixedMean distribution
- Updates to makefile for easier publishing
- Drops support for python 3.7 and 3.8
- Now supports Python 3.11 and 3.12
- Fixed issue with `np.bool`
- Optimized memory usage in `pred_dist`
- Removed declared pandas dependency
- Significant improvements to run times on tests during development
- Minor enhancements to github actions
- Fix deprecated numpy type alias. This was causing a warning with NumPy >=1.20 and an error with NumPy >=1.24
- Remove pandas as a declared dependency
NGBoost now includes a new partial_fit method that allows for incremental learning. This method appends new base models to the existing ones, which can be useful when new data becomes available over time or when the data is too large to fit in memory all at once.
The partial_fit method takes similar parameters to the fit method, including predictors X, outcomes Y, and validation sets X_val and Y_val. It also supports custom weights for the training and validation sets, as well as early stopping and custom loss monitoring.
Please note that the partial_fit method is not yet fully tested and may not work as expected in all cases. Use it with caution and thoroughly test its behavior in your specific use case before relying on it in production.
- Added support for the gamma distribution
- Added sklearn support to `set_params`
- Fixed off-by-one issue for max trees
- Upgraded version of the `black` formatter to 22.8.0