
Introduce mutating functions for Problem member functions #53

Merged
merged 108 commits into from
Mar 9, 2021

Conversation

kellertuer
Member

@kellertuer kellertuer commented Jan 24, 2021

This is a major rework of the inner structure of Manopt.jl to support mutating problem functions, most prominently `gradient!(X, x)`.
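The difference between the two calling conventions can be sketched as follows; this is an illustrative Python stand-in (the actual code base is Julia, and all names here are hypothetical):

```python
# Sketch of allocating vs. mutating gradient conventions (illustrative,
# not Manopt.jl's actual code). The allocating form returns a fresh
# vector; the mutating form writes into a preallocated workspace X.

def grad_f(x):
    """Allocating: returns a new list (here, the gradient of f(x) = sum(x_i^2))."""
    return [2.0 * xi for xi in x]

def grad_f_inplace(X, x):
    """Mutating: overwrites the workspace X with the gradient and returns X."""
    for i, xi in enumerate(x):
        X[i] = 2.0 * xi
    return X

x = [1.0, 2.0, 3.0]
X = [0.0, 0.0, 0.0]
# Both conventions compute the same gradient; only memory handling differs.
assert grad_f(x) == grad_f_inplace(X, x) == [2.0, 4.0, 6.0]
```

The mutating form avoids one allocation per call, which matters in inner loops of iterative solvers.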

  • introduce a parameter for the problem to indicate whether its member functions are mutating or not.
  • introduce a unified gradient!! function and encapsulate all of this in get_gradient/get_gradient!
  • introduce fallbacks for Circle/PositiveNumbers, where points and tangent vectors are plain numbers, so gradient!! never mutates.
  • check where get_gradient! should be introduced
  • refactor hessian to hessian!!
  • update ApproximateHessian to act as a normal Hessian (mutating or allocating) function
  • refactor subgradient to subgradient!!
  • refactor the proximal maps to proximal_maps!!
  • refactor to stochastic_gradient!!
  • refactor all 4 functions from PrimalDualProblem to !!.
  • update the gradient and prox function library to provide efficient mutating and non-mutating variants. Check that all grad/prox/cost functions have M as their first parameter.
    • adjoint differentials
    • bezier curves
    • costs
    • differentials
    • Jacobi fields
    • proximal maps
  • improve test coverage
  • carefully load random functions and non-mutating specials only if Manifolds.jl is loaded.
  • check/update documentation
  • check tutorials and examples.
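The core idea behind the unified gradient!! function and its get_gradient/get_gradient! accessors can be sketched like this; a Python stand-in with hypothetical names (Manopt.jl realizes this via a type parameter on the problem and Julia's multiple dispatch):

```python
# Illustrative sketch: the problem stores one gradient function plus a
# flag saying whether it allocates or mutates; the two accessors hide
# that difference from the solvers. (Hypothetical names, not the
# actual Manopt.jl API.)

ALLOCATING, MUTATING = "allocating", "mutating"

class GradientProblem:
    def __init__(self, gradient, evaluation=ALLOCATING):
        self.gradient = gradient      # either grad(x) or grad!(X, x)
        self.evaluation = evaluation

def get_gradient(p, x):
    """Always returns a (new) gradient vector, whatever is stored."""
    if p.evaluation == ALLOCATING:
        return p.gradient(x)
    X = [0.0] * len(x)
    p.gradient(X, x)
    return X

def get_gradient_inplace(p, X, x):
    """Always fills X; falls back to a copy if only an allocating
    function is stored (cf. the Circle/PositiveNumbers fallbacks)."""
    if p.evaluation == MUTATING:
        p.gradient(X, x)
    else:
        X[:] = p.gradient(x)
    return X

# Both problem variants yield the same result through both accessors.
alloc_p = GradientProblem(lambda x: [2.0 * xi for xi in x])

def grad_mut(X, x):
    for i, xi in enumerate(x):
        X[i] = 2.0 * xi

mut_p = GradientProblem(grad_mut, evaluation=MUTATING)

x = [1.0, -1.0]
X = [0.0, 0.0]
assert get_gradient(alloc_p, x) == get_gradient(mut_p, x) == [2.0, -2.0]
assert get_gradient_inplace(alloc_p, X, x) == [2.0, -2.0]
```

Solvers then only ever call the accessors, so they work unchanged with either kind of user-provided gradient.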

This then solves #52.

The problem parameter might later even be used to indicate whether the gradient/Hessian is Riemannian or Euclidean, though the conversions between the two have to be established first.

Maybe a small benchmark of the two new methods (providing either an allocating or a mutating gradient) would be nice, too.
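Such a micro-benchmark could be sketched as follows; a hypothetical Python stand-in (in the Julia code base one would use BenchmarkTools.jl's @btime instead):

```python
# Hypothetical micro-benchmark comparing an allocating and a mutating
# gradient call. The point: the mutating variant reuses its buffer,
# so no fresh list is built on every evaluation.
import timeit

n = 10_000
x = [float(i) for i in range(n)]
X = [0.0] * n

def grad_alloc(x):
    # Allocates a new list on every call.
    return [2.0 * xi for xi in x]

def grad_mut(X, x):
    # Writes into the preallocated buffer X.
    for i, xi in enumerate(x):
        X[i] = 2.0 * xi

t_alloc = timeit.timeit(lambda: grad_alloc(x), number=200)
t_mut = timeit.timeit(lambda: grad_mut(X, x), number=200)
print(f"allocating: {t_alloc:.4f}s  mutating: {t_mut:.4f}s")
```

Absolute timings depend on the machine; in Julia the more telling number is usually the allocation count reported by @btime.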

@kellertuer kellertuer changed the title [WIP] Introduce mutating functions for the gradient and the hessian [WIP] Introduce mutating functions for Problem member functions Jan 24, 2021
@codecov

codecov bot commented Jan 24, 2021

Codecov Report

Merging #53 (e7ca28d) into master (7b58373) will increase coverage by 0.37%.
The diff coverage is 99.66%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master      #53      +/-   ##
==========================================
+ Coverage   99.28%   99.66%   +0.37%     
==========================================
  Files          41       45       +4     
  Lines        1968     2360     +392     
==========================================
+ Hits         1954     2352     +398     
+ Misses         14        8       -6     
Impacted Files Coverage Δ
src/data/artificialDataFunctions.jl 100.00% <ø> (ø)
src/functions/bezier_curves.jl 100.00% <ø> (ø)
src/functions/costs.jl 100.00% <ø> (ø)
src/helpers/errorMeasures.jl 100.00% <ø> (ø)
src/solvers/solver.jl 100.00% <ø> (ø)
src/plans/stepsize.jl 95.58% <93.10%> (+0.03%) ⬆️
src/solvers/trust_regions.jl 98.48% <98.07%> (+1.87%) ⬆️
src/Manopt.jl 100.00% <100.00%> (ø)
src/functions/Jacobi_fields.jl 100.00% <100.00%> (ø)
src/functions/adjoint_differentials.jl 100.00% <100.00%> (ø)
... and 35 more


Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7b58373...e7ca28d. Read the comment docs.

kellertuer and others added 24 commits January 25, 2021 08:25
# Conflicts:
#	Project.toml
#	src/plans/gradient_plan.jl
#	src/plans/hessian_plan.jl
#	src/plans/stepsize.jl
#	src/solvers/gradient_descent.jl
#	test/solvers/test_trust_regions.jl
* use grad instead of ∇ to denote the gradient
* use gradient within options (instead of ∇) to denote the current gradient
* use gradient!! within problems (instead of ∇) to denote the gradient function gradF
* use `\operatorname{grad}` within math formulas (instead of ∇) to denote the gradient
* use `grad_` (instead of `∇`) to prefix gradient functions
* introduce a notation page in the documentation.
* adapt Newton (still always allocating) to the new scheme.
# Conflicts:
#	src/solvers/quasi_Newton.jl
@kellertuer
Member Author

This PR is finished, we just need JuliaManifolds/Manifolds.jl#334 to be merged.

@kellertuer kellertuer changed the title [WIP] Introduce mutating functions for Problem member functions Introduce mutating functions for Problem member functions Mar 8, 2021
@kellertuer kellertuer merged commit 1dc0d79 into master Mar 9, 2021
@kellertuer kellertuer deleted the mutating-evalutations branch March 9, 2021 08:28