feat: Add a functional API for optimization #98
Merged
Conversation
This commit introduces an alternative API for EvoTorch that conforms to the functional programming paradigm. This functional API can be used together with `torch.func.vmap`, and can therefore optimize not just a single population but a batch of populations simultaneously. The main improvements are:

- Functional counterparts of the cross-entropy method (`cem`) and policy gradients with parameter-based exploration (`pgpe`) are implemented. These algorithm implementations can be used with `vmap`, or can be given batches of starting points (`center_init` arguments) so that they generate batches of populations centered around them.
- Functional counterparts of the gradient-based optimizers `adam`, `clipup`, and `sgd` are implemented. The interfaces of these optimizers are similar to those of the functional `cem` and `pgpe`, so the user can switch back and forth between the evolutionary approach and the gradient-based approach with minimal code changes.
- The decorator `@expects_ndim` is introduced. It declares how many dimensions are expected for each positional argument of the decorated function. Upon receiving tensors with more dimensions than expected, the decorated function interprets those tensors as batched arguments, applies `vmap` on itself, and performs its operations across the batch dimensions.
- The decorator `@rowwise` is introduced. It declares that a function is implemented with the assumption that its argument is a vector. If the function receives a tensor with 2 or more dimensions, it applies `vmap` on itself and performs its operations across the batch dimensions (see the sketch below).
Co-authored-by: Rupesh K Srivastava <rupesh@nnaisense.com>
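To illustrate the row-wise lifting described above, here is a minimal, self-contained sketch of how a function written for a single vector can be broadcast across arbitrary batch dimensions with `torch.func.vmap`. This is not EvoTorch's actual `@rowwise` implementation; the decorator name `rowwise_sketch` and its internals are illustrative only.

```python
import torch
from torch.func import vmap


def rowwise_sketch(fn):
    """Illustrative stand-in for a @rowwise-style decorator.

    The wrapped function is written as if its input were a single
    1-D tensor (a "row"). If the input has extra leading dimensions,
    vmap is applied once per extra dimension, so the row-wise logic
    is broadcast across all batch dimensions.
    """
    def wrapper(x: torch.Tensor) -> torch.Tensor:
        f = fn
        for _ in range(x.ndim - 1):  # one vmap per extra batch dimension
            f = vmap(f)
        return f(x)
    return wrapper


@rowwise_sketch
def sphere(solution: torch.Tensor) -> torch.Tensor:
    # Written purely for a single solution vector.
    return torch.sum(solution ** 2)


single = sphere(torch.randn(10))          # a single fitness value
population = sphere(torch.randn(50, 10))  # 50 fitness values
batched = sphere(torch.randn(4, 50, 10))  # 4 populations of 50 solutions each
```

The same batching idea is what lets the functional `cem` and `pgpe` accept batched `center_init` arguments and produce batches of populations centered around them.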
Codecov Report

@@           Coverage Diff            @@
##           master      #98    +/-  ##
========================================
+ Coverage   76.74%   77.36%   +0.61%
========================================
  Files          49       57       +8
  Lines        7509     8213     +704
========================================
+ Hits         5763     6354     +591
- Misses       1746     1859     +113

View full report in Codecov by Sentry.
…de more detailed descriptions
This commit adds a classmethod with the signature `functional_sample(num_solutions, params)` to the classes `SeparableGaussian` and `SymmetricSeparableGaussian`. These classmethods contain alternative implementations for generating samples from their distributions in a completely stateless manner. Functional samplers created via the utility function `make_functional_sampler(...)` now check whether the wrapped distribution defines the classmethod `functional_sample(...)` (the other case being `functional_sample = NotImplemented`). If the classmethod exists, the functional sampler uses it to generate the samples. The goal of this new mechanism is to allow distributions to provide alternative sampling mechanisms with better compatibility with the pure functional programming paradigm.
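To make the idea concrete, below is a hypothetical sketch of what such a stateless classmethod could look like for a separable Gaussian. The class name, the assumed `params` layout (`"mu"` and `"sigma"` keys), and all other details are illustrative assumptions, not EvoTorch's actual implementation.

```python
import torch


class SeparableGaussianSketch:
    """Illustrative stand-in for a separable Gaussian search distribution."""

    @classmethod
    def functional_sample(cls, num_solutions: int, params: dict) -> torch.Tensor:
        # Assumed layout: `params` holds per-dimension mean and standard deviation.
        mu = params["mu"]        # shape: (solution_length,)
        sigma = params["sigma"]  # shape: (solution_length,)
        noise = torch.randn(
            num_solutions, mu.shape[-1], dtype=mu.dtype, device=mu.device
        )
        # The result depends only on the arguments: no distribution object is
        # instantiated and no internal state is read or mutated.
        return mu + sigma * noise


# Hypothetical usage: draw 100 samples of a 20-dimensional solution.
samples = SeparableGaussianSketch.functional_sample(
    100, {"mu": torch.zeros(20), "sigma": torch.ones(20)}
)
```

A functional sampler built with `make_functional_sampler(...)` would then detect this classmethod and route sampling through it instead of the default stateful path.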
flukeskywalker approved these changes on Jun 6, 2024.