
Refactor code to make it Optimization.jl compatible #196

Merged: 8 commits merged into master on Dec 18, 2022

Conversation

Vaibhavdixit02 (Member)

No description provided.

gcfg = ForwardDiff.GradientConfig(cost_function, autodiff_prototype, autodiff_chunk)
g! = (x, out) -> ForwardDiff.gradient!(out, cost_function, x, gcfg)
elseif flsa_gradient
if flsa_gradient
Member:

this is also an autodiff option
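For context on where those options end up: with Optimization.jl, the gradient wiring above is no longer assembled by hand; the AD backend is chosen through an ADType when the OptimizationFunction is constructed. A minimal sketch, assuming a placeholder objective (toy_cost and the initial point are illustrative, not from this PR):

using Optimization, OptimizationOptimJL, ForwardDiff

# placeholder objective; in this package it would be the ODE-fitting cost
toy_cost(p) = sum(abs2, p .- 1.0)

# the ADType replaces the hand-written GradientConfig / gradient! setup
optf = OptimizationFunction((p, _) -> toy_cost(p), Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.5, 0.5])
sol = solve(prob, BFGS())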

end
cost_function(p)
else
cost_function2 = nothing
Member:

cost_function2 can in general just be removed.

Member Author:

Yeah, even I thought that, but I wasn't sure if it would be too breaking.

Member:

Go for broke: this should get a major release with this change anyway. Just make these functions that return objective functions into functions that return an OptimizationProblem or something of the sort.

Member Author:

Okay cool

Comment on lines 90 to 92
est_sol = preview_est_sol[i]
_du = f(est_sol, p, tpoints[i])
cost += sum(abs2, vec(preview_est_deriv[i]) .- vec(_du))
Member:

only oop (out-of-place)?
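A minimal sketch of how both forms could be covered, assuming f is a SciMLBase function wrapper so its isinplace trait is available (the other names are taken from the snippet above):

# sketch: support in-place f(du, u, p, t) as well as out-of-place f(u, p, t)
if SciMLBase.isinplace(f)
    _du = similar(est_sol)
    f(_du, est_sol, p, tpoints[i])
else
    _du = f(est_sol, p, tpoints[i])
end
cost += sum(abs2, vec(preview_est_deriv[i]) .- vec(_du))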

@Vaibhavdixit02 force-pushed the optimizationintegration branch 2 times, most recently from d470e20 to c84e0cb on December 17, 2022
@Vaibhavdixit02 force-pushed the optimizationintegration branch 2 times, most recently from 06f7cf4 to 4b4e394 on December 18, 2022
@Vaibhavdixit02 changed the title from "[WIP] Refactor code to make it Optimization.jl compatible" to "Refactor code to make it Optimization.jl compatible" on Dec 18, 2022
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
LsqFit = "2fda8390-95c7-5789-9bda-21331edee243"
PenaltyFunctions = "06bb1623-fdd5-5ca2-a01c-88eae3ea319e"
PreallocationTools = "d236fae5-4411-538c-8e31-a6e3d9e00b46"
RecursiveArrayTools = "731186ca-8d62-57ce-b412-fbd966d074cd"
SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462"
SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1"
Member:

It might be good to keep a SciMLSensitivity dependency, because users just get a weird error telling them to add it if they use AutoZygote, and I think it won't be too uncommon to do that.

That said, a better solution may be to make SciMLSensitivity a weak dependency of DiffEqBase that is added and used whenever Zygote, ReverseDiff, or Tracker are used. @oscardssmith is that possible?

Yeah, that would work.
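For reference, the weak-dependency mechanism under discussion is Julia's package extensions (Julia 1.9+), declared in Project.toml. A hedged sketch of the general shape; the extension module name is hypothetical, and note that an extension only activates when the weak dependency is loaded in the user's environment, it cannot install the package by itself:

[weakdeps]
SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1"

[extensions]
# hypothetical module name; loads only when SciMLSensitivity is present
DiffEqBaseSciMLSensitivityExt = "SciMLSensitivity"

The glue code that currently requires an unconditional SciMLSensitivity dependency would then live in that extension module.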

Comment on lines 117 to -201
opt = Opt(:LN_BOBYQA, 3)
lower_bounds!(opt, [9.0, 20.0, 2.0])
upper_bounds!(opt, [11.0, 30.0, 3.0])
min_objective!(opt, obj_short.cost_function2)
xtol_rel!(opt, 1e-12)
maxeval!(opt, 10000)
@time (minf, minx, ret) = NLopt.optimize(opt, [9.0, 20.0, 2.0])
Member:

beautiful
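For comparison, the same short benchmark routed through Optimization.jl instead of raw NLopt; a hedged sketch assuming the refactored obj_short is an OptimizationFunction and that the OptimizationNLopt wrapper's solver-selection syntax is as shown:

using Optimization, OptimizationNLopt

optprob = OptimizationProblem(obj_short, [9.0, 20.0, 2.0];
                              lb = [9.0, 20.0, 2.0], ub = [11.0, 30.0, 3.0])
@time sol = solve(optprob, NLopt.LN_BOBYQA(); maxiters = 10_000)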

@ChrisRackauckas (Member):

Let's merge but not tag until the docs are ready.

@ChrisRackauckas merged commit f27fb46 into master on Dec 18, 2022
@ChrisRackauckas deleted the optimizationintegration branch on December 18, 2022 12:15