This package implements algorithms for nonsmooth optimization. The implementations closely follow the related papers; tuning the parameters is advisable to get the best out of the methods. Comments and suggestions are more than welcome: get in touch via mail or open an issue!
Setup is as follows, in a Julia REPL (default environment):
```julia
using Pkg
Pkg.update()
Pkg.Registry.add(Pkg.RegistrySpec(url = "https://github.com/GillesBareilles/OptimRegistry.jl"))
Pkg.add("NonSmoothSolvers")
```
The generic solvers implemented are:
- Nonsmooth BFGS: *Nonsmooth optimization via quasi-Newton methods*, Adrian S. Lewis and Michael L. Overton, 2012.
- VU bundle: *A VU-algorithm for Convex Minimization*, R. Mifflin and C. Sagastizábal, Mathematical Programming, 2005.
- Gradient Sampling: *Gradient Sampling Methods for Nonsmooth Optimization*, J. V. Burke, F. E. Curtis, A. S. Lewis, M. L. Overton and L. E. A. Simões, 2018.
A general nonsmooth optimization problem may be implemented following the `NonSmoothProblems.jl` interface:
```julia
using LinearAlgebra
using NonSmoothProblems
using NonSmoothSolvers
import NonSmoothProblems: F, ∂F_elt, is_differentiable

struct Simplel1 <: NonSmoothProblems.NonSmoothPb end
F(::Simplel1, x) = norm(x, 1)                      # objective value
∂F_elt(::Simplel1, x) = sign.(x)                   # one element of the subdifferential
is_differentiable(::Simplel1, x) = !any(iszero, x) # the l1 norm is differentiable away from the axes
```
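A quick sanity check of these oracles (the values follow directly from the definitions above):

```julia
pb = Simplel1()
F(pb, [1.0, -2.0, 0.0])                 # 3.0
∂F_elt(pb, [1.0, -2.0, 0.0])            # [1.0, -1.0, 0.0]
is_differentiable(pb, [1.0, -2.0, 0.0]) # false: one coordinate lies on a kink
```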
Note that `NonSmoothProblems.jl` implements some nonsmooth problems (maximum of quadratics, maximum eigenvalue).
```julia
n = 10
pb = Simplel1()
x = rand(n)
optparams = OptimizerParams(iterations_limit=20, trace_length=20)

o = GradientSampling(x)
xfinal_gs, tr = optimize!(pb, o, x; optparams)

o = NSBFGS()
xfinal_nsbfgs, tr = optimize!(pb, o, x; optparams)

o = VUbundle()
xfinal_vu, tr = optimize!(pb, o, x; optparams)
```
Other problems include:
- the historical MaxQuad problem (*Numerical Optimization*, J. F. Bonnans, J. C. Gilbert, C. Lemaréchal and C. A. Sagastizábal, 2nd edition, p. 153):
  ```julia
  pb = MaxQuadBGLS()
  x = ones(10)
  ```
- a maximum of two quadratics:
  ```julia
  pb = MaxQuadAL()
  x = ones(2)
  ```
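These problems plug into the same `optimize!` call as the `Simplel1` example; a minimal sketch, reusing `optparams` from above:

```julia
pb = MaxQuadBGLS()
x = ones(10)
xfinal, tr = optimize!(pb, NSBFGS(), x; optparams)
```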
The `optimize!` function accepts:
- several convergence checkers, possibly defined by the user;
- several policies for recording and displaying data along iterations.

The output is the final point generated by the optimization procedure, and a trace object. The trace is a vector of `OptimizationState`, an immutable object which holds relevant indicator values at the end of an iteration, such as `it`, `time`, `F_x`, `norm_step`. The user can add custom indicators by passing a callback function to `optimize!`:
```julia
using OrderedCollections

# Record a copy of the current iterate at each iteration.
getx(optimizer, optimizerstate) = deepcopy(optimizerstate.x)
optimstate_extensions = OrderedDict{Symbol, Function}(:x => getx)

o = NSBFGS()
xfinal_nsbfgs, tr = optimize!(pb, o, x; optparams, optimstate_extensions)
```
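The trace can then be post-processed; a minimal sketch, assuming the standard indicators are stored as fields of each `OptimizationState`:

```julia
# Collect the objective values and iteration counters recorded in the trace.
fvalues = [state.F_x for state in tr]
iters = [state.it for state in tr]
```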
Method calls may be (roughly) timed using `TimerOutputs`. This functionality is turned on and off by calling `TimerOutputs.enable_debug_timings(NonSmoothProblems)` and `TimerOutputs.disable_debug_timings(NonSmoothProblems)`.
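A possible workflow, reusing `pb`, `x` and `optparams` from above (a sketch, assuming the instrumented sections report to `TimerOutputs`' default timer):

```julia
using TimerOutputs
using NonSmoothProblems, NonSmoothSolvers

TimerOutputs.enable_debug_timings(NonSmoothProblems)
reset_timer!()                        # clear the default timer
optimize!(pb, NSBFGS(), x; optparams) # timed run
print_timer()                         # display the collected timings
TimerOutputs.disable_debug_timings(NonSmoothProblems)
```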
Profiling the nonsmooth BFGS shows that more than 90% of the time is spent in the line search, calling the oracles `F`, `∂F_elt` and `is_differentiable`. In some cases it is faster to compute all of these at once rather than separately.

```julia
using NonSmoothProblems, NonSmoothSolvers, StatProfilerHTML
const NSS = NonSmoothSolvers # shorthand used below

pb = MaxQuadBGLS(Float64)
o = NSBFGS{Float64}()

function toto(n)
    for i = 1:n
        state = NSS.initial_state(o, ones(10), pb)
        NSS.update_iterate!(state, o, pb)
    end
end

toto(1)                  # warm up / compile
@profilehtml toto(10000)
```
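For the `Simplel1` example above, such a fused oracle could look like the following sketch; the name `oracle_all` is hypothetical, and the solvers would need to be adapted to call it:

```julia
# Hypothetical fused oracle: objective value, a subgradient element and
# differentiability of the l1 norm, all computed in a single pass over x.
function oracle_all(::Simplel1, x)
    Fx = zero(eltype(x))
    g = similar(x)
    differentiable = true
    for (i, xi) in pairs(x)
        Fx += abs(xi)
        g[i] = sign(xi)
        differentiable &= !iszero(xi)
    end
    return Fx, g, differentiable
end
```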
The same method applies to Gradient Sampling, with `o = GradientSampling(x)`: there, most of the time is spent solving the QP.
This is a work in progress; in particular, there may be bugs in the algorithms. Todo list:
- [ ] list the features of `optimize!`;