define upto_gradient for MeritObjective #36

Open
longemen3000 opened this issue Jun 14, 2022 · 1 comment
longemen3000 (Contributor) commented Jun 14, 2022

something like this?

using LinearAlgebra: norm

function NLSolvers.upto_gradient(meritobj::NLSolvers.MeritObjective, ∇f, x)
    neq = meritobj.prob
    G = neq.R.F(∇f, x)      # residual vector F(x)
    F = (norm(G)^2) / 2     # sum-of-squares merit value
    return F, G
end

With that, NLSolvers.solve(prob, x0, LineSearch(Newton(), HZAW())) seems to work.
EDIT: no, it doesn't, but at least it now fails with an error inside HZAW instead:
MethodError: no method matching isfinite(::NamedTuple{(:ϕ, :Fx), Tuple{Float64, StaticArrays.MVector{2, Float64}}})

while !isfinite(φc) && iter <= maxiter
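
(For context: the MethodError shows that φc at this point is the NamedTuple with fields ϕ and Fx, not a scalar, which is why isfinite has no matching method. A rough sketch of one possible workaround, assuming only the scalar ϕ field matters for that finiteness check; the helper name below is made up and is not NLSolvers API:)

merit_value(φ::NamedTuple) = φ.ϕ   # pull out the scalar merit value
merit_value(φ::Real) = φ           # plain numbers pass through unchanged

# the loop condition above would then read something like
# while !isfinite(merit_value(φc)) && iter <= maxiter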

longemen3000 (Contributor, Author) commented:

Is this related?

# Need to restrict to static and backtracking here because we don't allow
# for methods that calculate the gradient of the line objective.
#
# For non-linear systems of equations we choose the sum-of-
# squares merit function. Some useful things to remember is:
#
# f(y) = 1/2*|| F(y) ||^2 =>
# ∇_df = -d'*J(x)'*F(x)
#
# where we remember the notation x means the current iterate and y is any
# proposal. This means that if we step in the Newton direction such that d
# is defined by
#
# J(x)*d = -F(x) => -d'*J(x)' = F(x)' =>
# ∇_df = -F(x)'*F(x) = -f(x)*2
#
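
For reference, here is a small standalone check of the identity in that comment (plain Julia, no NLSolvers types; the system F below is made up purely for illustration):

using LinearAlgebra

F(x) = [x[1]^2 + x[2] - 1, x[1] * x[2] - 2]    # arbitrary small nonlinear system
J(x) = [2 * x[1]  1.0;                         # its Jacobian
        x[2]      x[1]]
f(x) = norm(F(x))^2 / 2                        # sum-of-squares merit

x = [0.7, 0.3]
d = -(J(x) \ F(x))                             # Newton direction: J(x)*d = -F(x)

analytic = dot(J(x)' * F(x), d)                # directional derivative ∇f(x)'*d
numeric  = (f(x + 1e-6 * d) - f(x)) / 1e-6     # forward-difference check
@show analytic numeric (-2 * f(x))             # all three should agree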
