High-performance Python symbolic regression library based on parallel local search
- Zero hyperparameter tuning.
- Accurate results in seconds or minutes, in contrast to slow GP-based methods.
- Small model sizes.
- Support for regression, classification and fuzzy math.
- Support for 32- and 64-bit floating-point arithmetic.
- Works with unprotected versions of math operators (log, sqrt, division).
- Speeds up the search by using feature importances computed from a black-box model.
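To illustrate what "unprotected" operators mean here, the sketch below contrasts the common GP-style protected division with a plain division that is allowed to produce infinities. The function names and the penalization strategy are illustrative assumptions, not HROCH's actual internals; the idea is simply that invalid intermediate values make a candidate score badly instead of being masked.

```python
import math

def protected_div(a, b):
    # Classic GP-style "protected" division: masks division by
    # (near-)zero by returning a fixed value, which can hide bad models.
    return a / b if abs(b) > 1e-9 else 1.0

def unprotected_div(a, b):
    # Unprotected division: division by zero yields +/-inf (or nan for 0/0),
    # so a search can simply assign such candidates a bad fitness.
    try:
        return a / b
    except ZeroDivisionError:
        return math.inf if a > 0 else (-math.inf if a < 0 else math.nan)
```

A search over unprotected operators then only needs one rule: any candidate whose predictions contain `inf`/`nan` is rejected or scored as worst-possible.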
| Supported instructions | |
| --- | --- |
| math | add, sub, mul, div, pdiv, inv, minv, sq2, pow, exp, log, sqrt, cbrt, aq |
| goniometric | sin, cos, tan, asin, acos, atan, sinh, cosh, tanh |
| other | nop, max, min, abs, floor, ceil, lt, gt, lte, gte |
| fuzzy | f_and, f_or, f_xor, f_impl, f_not, f_nand, f_nor, f_nxor, f_nimpl |
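One less common entry in the table is `aq`. Assuming it refers to the analytic quotient from the symbolic-regression literature (not confirmed by this README), it is commonly defined as a smooth, always-finite alternative to division:

```python
import math

def aq(a, b):
    # Analytic quotient: behaves like a/b for large |b|,
    # but stays finite and smooth as b approaches 0.
    return a / math.sqrt(1.0 + b * b)
```

Unlike protected division, `aq` is differentiable everywhere, which tends to produce smoother fitness landscapes for local search.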
C++20 source code available in a separate repo, sr_core
- AVX2 instruction set (all modern CPUs support this)
- numpy
- sklearn
```sh
pip install HROCH
```
Symbolic_Regression_Demo.ipynb
```python
from HROCH import SymbolicRegressor

reg = SymbolicRegressor(num_threads=8, time_limit=60.0, problem='math', precision='f64')
reg.fit(X_train, y_train)
yp = reg.predict(X_test)
```
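The snippet above assumes `X_train`, `y_train` and `X_test` already exist. A minimal synthetic setup (using numpy, which the library lists as a dependency; the target function and split sizes are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.uniform(-3.0, 3.0, size=(200, 2))
# Target with a known closed form that a symbolic regressor could recover.
y = X[:, 0] ** 2 + np.sin(X[:, 1])

# Simple holdout split without sklearn.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]
```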
- Sklearn compatibility
- Classifiers:
  - NonlinearLogisticRegressor for binary classification
  - SymbolicClassifier for multiclass classification
  - FuzzyRegressor for a special binary classification
- Xi correlation used to filter out unrelated features
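Assuming "Xi correlation" refers to Chatterjee's rank-based ξ coefficient (a plausible reading, not confirmed by this README), here is a pure-Python sketch of the no-ties variant. It measures how close y is to being a function of x: near 0 for unrelated features, near 1 for a deterministic (even non-monotonic) relationship, which makes it a natural pre-filter.

```python
def xi_correlation(x, y):
    # Chatterjee's xi coefficient, no-ties case:
    # sort pairs by x, rank the y values, and measure how wildly
    # consecutive y-ranks jump. Small jumps => y depends on x.
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    y_sorted = [y[i] for i in order]
    ranked = sorted(y_sorted)
    ranks = [ranked.index(v) + 1 for v in y_sorted]  # O(n^2), fine for a sketch
    jumps = sum(abs(ranks[i + 1] - ranks[i]) for i in range(n - 1))
    return 1.0 - 3.0 * jumps / (n * n - 1)
```

A feature whose ξ against the target falls below some threshold could then be dropped before the search starts.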
Older versions
- Public C++ sources
- Command-line interface changed to cpython
- Support for classification scores: log loss and accuracy
- Support for final transformations:
- ordinal regression
- logistic function
- clipping
- Access to equations from all parallel hill climbers
- User-defined constants
- Feature probabilities as an input parameter
- Custom instruction sets
- Parallel hill-climbing parameters
- Improved late-acceptance hill climbing
- First release
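The changelog mentions late-acceptance hill climbing, the local-search scheme the library builds on. As a reference for the idea (a generic sketch of the published algorithm, not HROCH's implementation; all names and parameters here are illustrative):

```python
import random

def lahc_minimize(cost, start, neighbor, history_len=50, iters=5000, seed=0):
    # Late-acceptance hill climbing: accept a move if it beats either the
    # current cost or the cost recorded history_len steps ago. The delayed
    # comparison lets the search escape shallow local optima; a longer
    # history accepts more uphill moves, much like a slower annealing.
    rng = random.Random(seed)
    cur = start
    cur_cost = cost(cur)
    history = [cur_cost] * history_len
    best, best_cost = cur, cur_cost
    for k in range(iters):
        cand = neighbor(cur, rng)
        cand_cost = cost(cand)
        if cand_cost <= history[k % history_len] or cand_cost <= cur_cost:
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        history[k % history_len] = cur_cost
    return best, best_cost

# Toy usage: minimize (x - 3)^2 with Gaussian steps.
best, best_cost = lahc_minimize(
    lambda x: (x - 3.0) ** 2,
    start=0.0,
    neighbor=lambda x, rng: x + rng.gauss(0.0, 0.5),
)
```

Running one such climber per thread, each from a different start, is a common way to parallelize this kind of search.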