
[dartpy] Add optimizer APIs with GradientDescent and nlopt solvers #1325

Merged 10 commits into master from dartpy/optimizer on May 19, 2019

Conversation

jslee02
Member

@jslee02 jslee02 commented May 19, 2019

Example code:

import math
import dartpy as dart


# Problem reference: http://ab-initio.mit.edu/wiki/index.php/NLopt_Tutorial
class SampleObjFunc(dart.optimizer.Function):
    def eval(self, x):
        return math.sqrt(x[1])

    def evalGradient(self, x, grad):
        grad[0] = 0
        # The small epsilon guards against division by zero when x[1] == 0.
        grad[1] = 0.5 / (math.sqrt(x[1]) + 0.000001)


class SampleConstFunc(dart.optimizer.Function):
    def __init__(self, a, b):
        super().__init__()
        self.a = a
        self.b = b

    def eval(self, x):
        a = self.a
        b = self.b
        return (a*x[0] + b)**3 - x[1]

    def evalGradient(self, x, grad):
        a = self.a
        b = self.b
        grad[0] = 3 * a * (a*x[0] + b)**2
        grad[1] = -1.0


# Problem settings
prob = dart.optimizer.Problem(2)
prob.setLowerBounds([-1e100, 0])
prob.setInitialGuess([1.234, 5.678])
prob.setObjective(SampleObjFunc())
prob.addIneqConstraint(SampleConstFunc(2, 0))
prob.addIneqConstraint(SampleConstFunc(-1, 1))

# Solve
solver = dart.optimizer.NloptSolver(prob)
solver.setAlgorithm(dart.optimizer.NloptSolver.Algorithm.LD_MMA)
solver.solve()

# Solutions
min_f = prob.getOptimumValue()
opt_x = prob.getOptimalSolution()
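As a sanity check on the example, the NLopt tutorial problem referenced above has the known analytic optimum x = (1/3, 8/27), where the objective value is sqrt(8/27) ≈ 0.5443 and both inequality constraints are active. The short standalone script below (plain Python, no dartpy required) verifies those facts; the helper names obj and const are ad hoc, not part of the dartpy API:

```python
import math

def obj(x):
    # Objective from the tutorial: sqrt(x2)
    return math.sqrt(x[1])

def const(a, b, x):
    # Inequality constraint: (a*x1 + b)^3 - x2 <= 0
    return (a*x[0] + b)**3 - x[1]

x_opt = (1/3, 8/27)  # known analytic optimum

print(round(obj(x_opt), 4))              # ≈ 0.5443
print(abs(const(2, 0, x_opt)) < 1e-9)    # True: constraint 1 is active
print(abs(const(-1, 1, x_opt)) < 1e-9)   # True: constraint 2 is active
```

Comparing `prob.getOptimumValue()` and `prob.getOptimalSolution()` against these values is a convenient way to confirm the solver converged to the right point.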

Before creating a pull request

  • Document new methods and classes
  • Format new code files using clang-format

Before merging a pull request

  • Set version target by selecting a milestone on the right side
  • Summarize this change in CHANGELOG.md
  • Add unit test(s) for this change

@jslee02 jslee02 added this to the DART 6.9.0 milestone May 19, 2019
@jslee02 jslee02 marked this pull request as ready for review May 19, 2019 19:27

codecov bot commented May 19, 2019

Codecov Report

Merging #1325 into master will decrease coverage by 0.17%.
The diff coverage is 17.64%.

@@            Coverage Diff             @@
##           master    #1325      +/-   ##
==========================================
- Coverage    57.1%   56.93%   -0.18%     
==========================================
  Files         366      366              
  Lines       27050    27156     +106     
==========================================
+ Hits        15447    15460      +13     
- Misses      11603    11696      +93
Impacted Files Coverage Δ
dart/optimizer/nlopt/NloptSolver.cpp 22.5% <17.64%> (-20.06%) ⬇️
dart/dynamics/Skeleton.cpp 66.12% <0%> (+0.16%) ⬆️
dart/dynamics/EulerJoint.cpp 70.73% <0%> (+3.04%) ⬆️

@jslee02 jslee02 merged commit 7cea082 into master May 19, 2019
@jslee02 jslee02 deleted the dartpy/optimizer branch May 19, 2019 22:48
@jslee02 jslee02 changed the title [pydart] Add optimizer APIs with GradientDescent and nlopt solvers [dartpy] Add optimizer APIs with GradientDescent and nlopt solvers May 20, 2019