Gradient optimizers #1328
base: develop
Conversation
Codecov Report
Additional details and impacted files

@@            Coverage Diff            @@
##           develop    #1328    +/-  ##
=========================================
  Coverage    95.24%   95.24%
=========================================
  Files          814      814
  Lines        69309    69312     +3
=========================================
+ Hits         66015    66019     +4
+ Misses        3294     3293     -1
Adds a few gradient optimizers.
Everything is policy-based, so these should be quite extensible. I may add a few more optimizers in the future, but I think these are the main ones. If there are any specific ones you think should be added, I'd be happy to do so. Also, although everything is reverse-mode-autodiff centric, as long as you provide the objective function, a way to evaluate it, and a way to evaluate its gradient, everything should work correctly; a generic sketch of that calling pattern follows below.
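To make that concrete, here is a minimal, self-contained sketch of fixed-step gradient descent driven by a user-supplied objective and gradient. The names and the hand-rolled loop are illustrative only and are not the API added in this PR; the point is just that the optimizer only ever needs the objective value and its gradient, however they are produced (autodiff tape, analytic formula, finite differences).

```cpp
// Illustrative only: fixed-step gradient descent on a convex quadratic.
// Nothing here depends on how the gradient is computed.
#include <array>
#include <iostream>

int main()
{
    using point = std::array<double, 2>;

    // Objective f(x, y) = (x - 3)^2 + 2 (y + 1)^2, minimum at (3, -1).
    auto f = [](point const& p) {
        return (p[0] - 3) * (p[0] - 3) + 2 * (p[1] + 1) * (p[1] + 1);
    };

    // Gradient of f (what a reverse-mode autodiff tape would hand back).
    auto grad = [](point const& p) -> point {
        return { 2 * (p[0] - 3), 4 * (p[1] + 1) };
    };

    point x{ 0.0, 0.0 };
    double const step = 0.1;         // fixed learning rate
    for (int i = 0; i < 200; ++i) {  // x_{k+1} = x_k - step * grad f(x_k)
        point const g = grad(x);
        x[0] -= step * g[0];
        x[1] -= step * g[1];
    }

    std::cout << "f = " << f(x) << " at (" << x[0] << ", " << x[1] << ")\n";
}
```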
For some examples of how to use the optimizers, test_gradient_descent_optimizer.cpp, test_nesterov_optimizer.cpp, and test_lbfgs.cpp should be good starting points (a generic sketch of the Nesterov update is shown below). I'm working on the documentation currently; I wanted to hold off to see if any major revisions are necessary.
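For reference, the Nesterov variant exercised by test_nesterov_optimizer.cpp follows the standard look-ahead momentum update. The sketch below shows that update rule on the same kind of toy objective, again with made-up names and a hand-rolled loop rather than the interface added in this PR.

```cpp
// Illustrative only: standard Nesterov accelerated gradient update
// v_{k+1} = mu * v_k - eta * grad f(x_k + mu * v_k);  x_{k+1} = x_k + v_{k+1}.
#include <array>
#include <iostream>

int main()
{
    using point = std::array<double, 2>;

    // Convex quadratic with minimum at (3, -1).
    auto grad = [](point const& p) -> point {
        return { 2 * (p[0] - 3), 4 * (p[1] + 1) };
    };

    point x{ 0.0, 0.0 };
    point v{ 0.0, 0.0 };
    double const eta = 0.05;  // step size
    double const mu  = 0.9;   // momentum coefficient

    for (int i = 0; i < 300; ++i) {
        // Evaluate the gradient at the look-ahead point x + mu * v.
        point const look{ x[0] + mu * v[0], x[1] + mu * v[1] };
        point const g = grad(look);
        for (int d = 0; d < 2; ++d) {
            v[d] = mu * v[d] - eta * g[d];
            x[d] += v[d];
        }
    }

    std::cout << "minimum near (" << x[0] << ", " << x[1] << ")\n";  // expect (3, -1)
}
```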