NonSmoothSolvers.jl

https://img.shields.io/badge/docs-stable-blue.svg https://img.shields.io/badge/docs-dev-blue.svg https://github.com/GillesBareilles/NonSmoothSolvers.jl/actions/workflows/CI.yml/badge.svg?branch=master https://codecov.io/gh/GillesBareilles/NonSmoothSolvers.jl/branch/master/graph/badge.svg https://img.shields.io/badge/code%20style-blue-4495d1.svg

This package implements algorithms for nonsmooth optimization. The implementations closely follow the papers cited below; tuning the parameters is advisable to get the best out of each method. Comments and suggestions are more than welcome: get in touch by mail or open an issue!

Setup is as follows: in a Julia REPL (default environment):

#+begin_src julia
using Pkg
Pkg.update()
Pkg.Registry.add(RegistrySpec(url = "https://github.com/GillesBareilles/OptimRegistry.jl"))
Pkg.add("NonSmoothSolvers")
#+end_src

Solvers

The generic solvers implemented are:

  • Nonsmooth BFGS

    Nonsmooth optimization via quasi-Newton methods, Adrian S. Lewis, Michael L. Overton, 2012

  • VU bundle

    A VU-algorithm for Convex Minimization, R. Mifflin & C. Sagastizábal, Mathematical Programming, 2005

  • Gradient Sampling

    Gradient Sampling Methods for Nonsmooth Optimization, J.V. Burke, F.E. Curtis, A.S. Lewis, M.L. Overton, L.E.A. Sims, 2018.

Usage

Defining a problem

A general nonsmooth optimization problem may be implemented by following ~NonSmoothProblems.jl~'s interface:

#+begin_src julia
using LinearAlgebra
using NonSmoothProblems
using NonSmoothSolvers

struct Simplel1 <: NonSmoothProblems.NonSmoothPb end

F(::Simplel1, x) = norm(x, 1)
∂F_elt(::Simplel1, x) = sign.(x)
# the l1 norm is differentiable iff no coordinate is zero
is_differentiable(::Simplel1, x) = all(xᵢ -> xᵢ != 0, x)
#+end_src

Note that ~NonSmoothProblems.jl~ implements some nonsmooth problems (maximum of quadratics, maximum eigenvalue).
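As a quick sanity check of the oracles above, here is a standalone sketch of the same l1 oracles, written without the problem type so it runs in a bare REPL (an illustration, not part of the package's API):

#+begin_src julia
using LinearAlgebra

F(x) = norm(x, 1)
∂F_elt(x) = sign.(x)                          # one subgradient of the l1 norm
is_differentiable(x) = all(xᵢ -> xᵢ != 0, x)  # smooth away from the axes

x = [1.0, -2.0, 0.0]
F(x)                  # 3.0
∂F_elt(x)             # [1.0, -1.0, 0.0]
is_differentiable(x)  # false: x₃ = 0 lies on a kink of the l1 norm
#+end_src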

Running a solver

#+begin_src julia
n = 10
pb = Simplel1()
x = rand(n)
optparams = OptimizerParams(iterations_limit=20, trace_length=20)

o = GradientSampling(x)
xfinal_gs, tr = optimize!(pb, o, x; optparams)

o = NSBFGS()
xfinal_nsbfgs, tr = optimize!(pb, o, x; optparams)

o = VUbundle()
xfinal_vu, tr = optimize!(pb, o, x; optparams)
#+end_src

Other problems:

  • the historical MaxQuad problem, Numerical Optimization, BGLS, p. 153 (2nd edition):
    #+begin_src julia
    pb = MaxQuadBGLS()
    x = zeros(10) .+ 1
    #+end_src
  • a maximum of two quadratics:
    #+begin_src julia
    pb = MaxQuadAL()
    x = zeros(2) .+ 1
    #+end_src

Input parameters, output values

The optimize! function accepts:

  • several convergence checkers, possibly defined by the user;
  • several policies of data recording and displaying along iterations;

The output is the final point generated by the optimization procedure and a trace object. The trace is a vector of ~OptimizationState~, an immutable object holding relevant indicator values at the end of each iteration, such as ~it~, ~time~, ~F_x~, ~norm_step~. The user can add custom indicators by passing a callback function to ~optimize!~:

#+begin_src julia
using OrderedCollections

getx(optimizer, os) = deepcopy(os.x)
optimstate_extensions = OrderedDict{Symbol, Function}(:x => getx)

o = NSBFGS()
xfinal_nsbfgs, tr = optimize!(pb, o, x; optparams, optimstate_extensions)
#+end_src

Notes

Method calls may be (roughly) timed using TimerOutputs. This functionality is turned on and off by calling ~TimerOutputs.enable_debug_timings(NonSmoothProblems)~ and ~TimerOutputs.disable_debug_timings(NonSmoothProblems)~.

Efficiency of nsBFGS

Profiling the nsBFGS solver shows that more than 90% of the time is spent in the line search, calling the oracles ~F~, ~∂F_elt~ and ~is_differentiable~.

In some cases it is faster to compute all these at once rather than separately.

#+begin_src julia
using NonSmoothProblems, NonSmoothSolvers, StatProfilerHTML

pb = MaxQuadBGLS(Float64)
o = NSBFGS{Float64}()

function toto(n)
    for i = 1:n
        state = NSS.initial_state(o, ones(10), pb)
        NSS.update_iterate!(state, o, pb)
    end
end

toto(1); @profilehtml toto(10000)
#+end_src

Efficiency of gradient sampling

Same methodology as above, with ~o = GradientSampling(x)~.

Most of the time is spent solving the QP subproblem.
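For context, the quadratic subproblem in gradient sampling computes the minimum-norm element of the convex hull of the sampled gradients (this is the standard formulation from the literature; the package's exact QP may differ slightly):

\begin{equation}
\min_{\lambda \in \Delta_m} \Big\| \sum_{i=1}^{m} \lambda_i g_i \Big\|^2,
\qquad
\Delta_m = \Big\{ \lambda \in \mathbb{R}^m : \lambda \ge 0,\ \sum_{i=1}^{m} \lambda_i = 1 \Big\},
\end{equation}

where the $g_i$ are gradients of the objective sampled at points near the current iterate; the minimizer yields an approximate steepest-descent direction.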

Notes

This is a work in progress. In particular, there may be bugs in the algorithms, and the current todo list is:

  • [ ] list features of optimize!;