Releases: SciNim/numericalnim
meshgrid fix
What's Changed
- avoid name collisions of `meshgrid` by @HugoGranstrom in #42
Full Changelog: v0.8.8...v0.8.9
Extrapolation
The 1D interpolation methods now support extrapolation using these methods:
- `Constant`: Set all points outside the range of the interpolator to `extrapValue`.
- `Edge`: Use the value of the left/right edge.
- `Linear`: Uses linear extrapolation using the two points closest to the edge.
- `Native` (default): Uses the native method of the interpolator to extrapolate. For `Linear1D` it will be a linear extrapolation, and for Cubic and Hermite splines it will be cubic extrapolation.
- `Error`: Raises a `ValueError` if `x` is outside the range.
These are passed in as an argument to `eval` and `derivEval`:
```nim
let valEdge = interp.eval(x, Edge)
let valConstant = interp.eval(x, Constant, NaN)
```
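For example, a hedged end-to-end sketch (the `newHermiteSpline` constructor and the sample grid here are assumptions; any of the 1D interpolators should behave the same way):

```nim
import numericalnim, std/[math, sequtils]

# Sample sin(x) on [0, 10] and build a 1D interpolator.
let x = toSeq(0..50).mapIt(it.float * 0.2)
let y = x.mapIt(sin(it))
let interp = newHermiteSpline(x, y)

# Evaluate outside the range [0, 10] with different extrapolation methods.
echo interp.eval(11.0, Edge)           # value of the right edge
echo interp.eval(11.0, Constant, NaN)  # NaN for all points outside the range
echo interp.eval(11.0, Linear)         # linear continuation from the edge
echo interp.eval(11.0)                 # Native (default): cubic extrapolation for a spline
```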
`levmarq` uncertainties + CI Docs
- `levmarq` now accepts `yError`.
- `paramUncertainties` allows you to calculate the uncertainties of fitted parameters.
- `chi2` test added
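A rough sketch of how these pieces fit together (the `paramUncertainties` argument order and its return value being parameter variances are assumptions based on the description above; check the API docs before relying on them):

```nim
import numericalnim, arraymancer, std/[math, sequtils]

# Model to fit: y = a * exp(b * x), with the parameters packed into a Tensor.
proc fitFunc(params: Tensor[float], x: float): float =
  params[0] * exp(params[1] * x)

# Synthetic data generated from a = 2.0, b = 0.5.
let xData = toSeq(0..19).mapIt(it.float * 0.05).toTensor
let yData = xData.map_inline(2.0 * exp(0.5 * x))
let yError = ones[float](20) * 0.1   # assumed 1-sigma measurement errors
let params0 = [1.0, 1.0].toTensor

let params = levmarq(fitFunc, params0, xData, yData)

# Assumed: paramUncertainties returns the parameter variances, so take sqrt for 1-sigma errors.
let sigma = sqrt(paramUncertainties(params, fitFunc, xData, yData, yError))
echo params, " +- ", sigma
```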
What's Changed
- build docs in CI when pushing to master by @HugoGranstrom in #35
- add chi2 + add uncertainties to levmarq by @HugoGranstrom in #36
Full Changelog: v0.8.5...v0.8.6
Fix rbf bug
Radial basis function interpolation
With radial basis function interpolation, `numericalnim` finally gets an interpolation method which works on scattered data in arbitrary dimensions!
Basic usage:
```nim
let interp = newRbf(points, values)
let result = interp.eval(evalPoints)
```
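A slightly fuller sketch for scattered 2D data (the expected shapes, `(nPoints, nDims)` for `points` and one value per point for `values`, are assumptions; check the API docs):

```nim
import numericalnim, arraymancer, std/[math, random]

# Scattered samples of f(x, y) = sin(x) * cos(y) in the unit square.
let nPoints = 100
var points = newTensor[float](nPoints, 2)
var values = newTensor[float](nPoints)
for i in 0 ..< nPoints:
  points[i, 0] = rand(1.0)
  points[i, 1] = rand(1.0)
  values[i] = sin(points[i, 0]) * cos(points[i, 1])

let interp = newRbf(points, values)

# Evaluate at new scattered points, one point per row.
let evalPoints = [[0.5, 0.5], [0.1, 0.9]].toTensor
echo interp.eval(evalPoints)
```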
What's Changed
- Radial Basis functions by @HugoGranstrom in #33
Full Changelog: v0.8.3...v0.8.4
Export LineSearchCriterion
What's Changed
- export linesearch and test it by @HugoGranstrom in #32
Full Changelog: v0.8.2...v0.8.3
Fix Nim CI - strictEffects
Fix Nim CI
Fixes #29
Optimization has joined the chat
Multi-variate optimization and differentiation have been introduced.
- `numericalnim/differentiate` offers `tensorGradient(f, x)`, which calculates the gradient of `f` w.r.t. `x` using finite differences, as well as `tensorJacobian` (returns the transpose of the gradient), `tensorHessian` and `mixedDerivative`. It also provides `checkGradient(f, analyticGrad, x, tol)` to verify that the analytic gradient is correct by comparing it to the finite difference approximation.
- `numericalnim/optimize` now has several multi-variate optimization methods:
  - `steepestDescent`
  - `newton`
  - `bfgs`
  - `lbfgs`
  - They all have function signatures like:
    ```nim
    proc bfgs*[U; T: not Tensor](f: proc(x: Tensor[U]): T, x0: Tensor[U], options: OptimOptions[U, StandardOptions] = bfgsOptions[U](), analyticGradient: proc(x: Tensor[U]): Tensor[T] = nil): Tensor[U]
    ```
    where `f` is the function to be minimized, `x0` is the starting guess, `options` contains options like the tolerance (each method has its own options type, which can be created by for example `lbfgsOptions` or `newtonOptions`), and `analyticGradient` can be supplied to avoid having to do finite difference approximations of the derivatives.
  - There are 4 different line search methods supported, and those are set in the `options`: `Armijo`, `Wolfe`, `WolfeStrong`, `NoLineSearch`. A minimal usage sketch is shown after this list.
- `levmarq`: non-linear least-squares optimizer
  ```nim
  proc levmarq*[U; T: not Tensor](f: proc(params: Tensor[U], x: U): T, params0: Tensor[U], xData: Tensor[U], yData: Tensor[T], options: OptimOptions[U, LevmarqOptions[U]] = levmarqOptions[U]()): Tensor[U]
  ```
  - `f` is the function you want to fit to the parameters in `params`, and `x` is the value to evaluate the function at.
  - `params0` is the initial guess for the parameters.
  - `xData` is a 1D Tensor with the x points and `yData` is a 1D Tensor with the y points.
  - `options` can be created using `levmarqOptions`.
  - Returns the final parameters.
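A minimal sketch tying the differentiation and optimization pieces together (the test function is illustrative, and `checkGradient` returning a boolean is an assumption):

```nim
import numericalnim, arraymancer, std/math

# Rosenbrock function, minimum at (1, 1).
proc f(x: Tensor[float]): float =
  (1.0 - x[0])^2 + 100.0 * (x[1] - x[0]^2)^2

# Analytic gradient, used both for checking and for speeding up the optimizer.
proc grad(x: Tensor[float]): Tensor[float] =
  [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]^2),
   200.0 * (x[1] - x[0]^2)].toTensor

let x0 = [-1.0, 2.0].toTensor

echo tensorGradient(f, x0)             # finite difference gradient at x0
echo checkGradient(f, grad, x0, 1e-4)  # assumed to report whether the two gradients agree

# Minimize with BFGS; the analytic gradient argument is optional.
let xMin = bfgs(f, x0, analyticGradient = grad)
echo xMin                              # should end up close to [1.0, 1.0]
```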
Note: There are basic tests to ensure these methods converge for simple problems, but they are not tested on more complex problems and should be considered experimental until more tests have been done. Please try them out, but don't rely on them for anything important for now. Also, the API isn't set in stone yet so expect that it may change in future versions.
Fix nim CI
Adds the task `nimCI`, which is to be run by the Nim CI.