Releases: heal-research/pyoperon
pyoperon-0.4.0
What's Changed
This release is based on Operon rev. 4a93f98
- minor bugfix related to lexicographical sorting in NSGA2
- best order sort (DOI) implementation; Operon now contains all well-known non-dominated sorting algorithms
- refactored dispatch table using a generic backend interface (based on mdspan), with support for other math backends (Blaze, Eve, etc.)
- improved likelihoods (Gaussian, Poisson) which can also be used as objective functions
- many other small improvements and fixes
- support for SGD and L-BFGS algorithms for parameter tuning
The scikit-learn interface has been updated with some fixes and additional parameters:
- `local_iterations` parameter has been renamed to `optimizer_iterations`
- `optimizer` parameter accepts `lm`, `sgd` or `lbfgs` values to choose the optimization method
- `optimizer_likelihood` parameter specifies the likelihood used by the optimizer
- `optimizer_batch_size` controls the batch size for gradient descent
- `local_search_probability` controls the probability of applying local search to an individual
- `lamarckian_probability` controls the probability of writing optimized coefficients back into the genotype
- parameters `add_model_scale_term` and `add_model_intercept_term` control linear scaling of the final model
- `uncertainty` parameter specifies the variance of the error (taken into account inside the likelihood)
- `sgd_update_rule`, `sgd_learning_rate`, `sgd_beta`, `sgd_beta2`, `sgd_epsilon` can be used to configure the SGD algorithm
- `model_selection_criterion` parameter can be used to specify which model from the final Pareto front is returned (NSGA2)
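A minimal usage sketch of these options through the scikit-learn interface, under the assumption that the `SymbolicRegressor` class from `pyoperon.sklearn` accepts them as constructor keywords; the parameter names and the `lm`/`sgd`/`lbfgs` choices come from the list above, while the value string for `optimizer_likelihood` is an assumption:

```python
# Hedged sketch of the updated scikit-learn interface; the 'gaussian' value
# string is an assumption, the parameter names and the 'lm'/'sgd'/'lbfgs'
# choices are the ones listed in the notes above.
import numpy as np
from pyoperon.sklearn import SymbolicRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(200, 2))
y = X[:, 0] ** 2 + np.sin(X[:, 1])

reg = SymbolicRegressor(
    optimizer='sgd',                  # 'lm', 'sgd' or 'lbfgs'
    optimizer_iterations=10,          # formerly local_iterations
    optimizer_likelihood='gaussian',  # likelihood used by the optimizer (assumed value string)
    optimizer_batch_size=32,          # batch size for gradient descent
    local_search_probability=1.0,     # probability of applying local search to an individual
    lamarckian_probability=1.0,       # probability of writing tuned coefficients back into the genotype
    add_model_scale_term=True,        # linear scaling of the final model
    add_model_intercept_term=True,
)
reg.fit(X, y)
print(reg.score(X, y))                # standard scikit-learn R^2 score
```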
pyoperon-0.3.6
Changelog
This release is based on Operon rev. 88a15c3 and includes the following features:
- hand-crafted reverse-mode automatic differentiation module for symbolic expression trees, with much better runtime performance
- the ability to optimize all tree node coefficients via nonlinear least squares (previously, only leaf nodes were possible)
- slightly faster interpreter performance (+5-10%)
- a selection of new evaluators
  - `AggregateEvaluator`: aggregates multiple objectives into a single scalar (min, max, median, mean, harmonic mean, sum)
  - `BayesianInformationCriterionEvaluator`: computes the value of the Bayesian Information Criterion (BIC) for a symbolic regression model (a generic sketch of the BIC/AIC computation follows this list)
  - `AkaikeInformationCriterionEvaluator`: computes the value of the Akaike Information Criterion (AIC) for a symbolic regression model
  - `MinimumDescriptionLengthEvaluator`: computes the Minimum Description Length (MDL) of a symbolic regression model
- various other fixes and improvements
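For reference, a small standalone sketch of what the BIC and AIC evaluators measure for a regression model with Gaussian errors. This is generic textbook code, not the evaluators' internal implementation, and the MDL criterion used by Operon is more involved and not reproduced here.

```python
# Generic BIC/AIC for a regression model with Gaussian errors (up to an
# additive constant); illustrative only, not Operon's internal implementation.
import numpy as np

def gaussian_bic_aic(y_true, y_pred, k):
    """k = number of free model parameters (e.g. tree coefficients)."""
    n = len(y_true)
    sse = np.sum((np.asarray(y_true) - np.asarray(y_pred)) ** 2)
    # Gaussian log-likelihood with MLE variance sse/n, dropping terms
    # that do not depend on the model
    log_likelihood = -0.5 * n * np.log(sse / n)
    bic = k * np.log(n) - 2.0 * log_likelihood
    aic = 2.0 * k - 2.0 * log_likelihood
    return bic, aic
```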
The scikit-learn module now defaults to using the minimum description length (MDL) to select the best model from the Pareto front. This is configurable; the available criteria are MSE, BIC, AIC, and MDL.
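A short sketch of how this selection might be configured from Python. The keyword shown is the `model_selection_criterion` parameter listed under the 0.4.0 notes above, and the accepted value string is an assumption; only the set of criteria (MSE, BIC, AIC, MDL) and the MDL default are stated in these notes.

```python
# Hedged sketch: the 'minimum_description_length' value string is an assumption;
# only the available criteria (MSE, BIC, AIC, MDL) and the default (MDL) are
# stated in these release notes.
from pyoperon.sklearn import SymbolicRegressor

reg = SymbolicRegressor(
    model_selection_criterion='minimum_description_length',  # assumed string for the MDL default
)
```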