Vectorised simulations #116
Conversation
Here is a tricky thing to consider: what effect does it have on gradients when we conditionally set some value in a computation to a constant? Is that effect bad? If so, how do we fix it? E.g. ...
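One way to probe this question (a minimal sketch, not code from the PR) is to conditionally replace values with a constant via `torch.where` and inspect the resulting gradient: the gradient is zeroed for exactly the elements where the constant was substituted, which may or may not be the desired behaviour.

```python
import torch

# Conditionally set negative entries to the constant 0 and check what
# happens to the gradient of those entries.
x = torch.tensor([0.5, -0.5], requires_grad=True)
y = torch.where(x > 0, x, torch.zeros_like(x))  # negative entries become 0
y.sum().backward()
print(x.grad)  # gradient flows only where x itself was used, not the constant
```

Here `x.grad` comes out as `[1.0, 0.0]`: the element that was overwritten with a constant receives zero gradient, silently cutting it out of any gradient-based optimisation.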
We also need to consider input-dimension flexibility. For now, most of the reworked code probably assumes one input dimension for beams and one for settings. Maybe the final code can be more flexible, i.e. allow any number of input dimensions for both, kind of like having n-dimensional batches.
I've now made the decision that the batch dimensions of the incoming beam and the settings must match. We should add a way to broadcast them, i.e. if a beam only has a single value, or if some elements only have a single value, they are broadcast to the dimensions of the batched input. This should not be done if any of them has a different (non-singleton) batch dimension; that should not be allowed and hence fail.
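The rule described above could be sketched roughly as follows. The helper name and signature are purely illustrative (not from the PR): singleton batch dimensions are broadcast, genuine mismatches fail loudly.

```python
import torch

def broadcast_batch(beam: torch.Tensor, settings: torch.Tensor):
    """Hypothetical helper: broadcast singleton batch dims, reject mismatches."""
    b, s = beam.shape[0], settings.shape[0]
    if b == s:
        return beam, settings
    if b == 1:  # single beam, many settings -> repeat the beam
        return beam.expand(s, *beam.shape[1:]), settings
    if s == 1:  # single setting, many beams -> repeat the settings
        return beam, settings.expand(b, *settings.shape[1:])
    raise ValueError(f"Incompatible batch dimensions: {b} vs {s}")
```

For example, a beam with batch size 1 combined with settings of batch size 4 would be expanded to batch size 4, while batch sizes 2 and 3 would raise a `ValueError` rather than be silently combined.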
Looking at the ...
Some outstanding todos:
A further todo from 27.03.2024:
This will probably be a ...
@cr-xu I'm getting the "Merging is blocked" message again. I would like to figure out this time what is causing it. The only restriction I could find in the settings is the one I set a long time ago that you cannot push directly to ...
Nevermind ... I fixed it. The branch locking option seems to have been the issue.
Okay ... I didn't really fix it, but I also don't get it. The branch locking option seems to be what causes it to tell us that we cannot merge this PR. At first I didn't want to disable it, because then I was able to push to master without a PR. It turns out, though, that I can do that even when branch locking is turned on, presumably because I'm an admin. So I turned it off and now we can merge PRs again. I don't like that I'm able to accidentally push to master, and as far as I understand, the current settings should prevent an admin from pushing to master ... but clearly they don't. I also googled this and it keeps pointing to an option "Include administrators", but I cannot seem to find that one. It feels a little bit like it may have been renamed or removed (?)
Looks good to me.
I'm not so sure about pushing the .vscode/ltex... stuff, but it's not a big issue.
Just for reference: Even with your approval, reactivating the branch lock blocks the merge.
[WIP] Code changes to account for desy-ml#116
Description
Makes all simulations in Cheetah vectorised, as one would expect from typical neural network model implementations.
Motivation and Context
Vectorised simulations should allow for much faster parallel simulation of beams and settings. This is very similar to how you might run a neural network with batches, and benefits both speed-based and gradient-based applications.
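The speed-up comes from replacing a Python loop over simulations with one batched tensor operation. As a rough illustration (the shapes and the linear transfer map below are assumptions for the sketch, not Cheetah's actual API), a batch of 100 settings, each a 7x7 linear transfer map, can be applied to 100 beams of 1000 particles in a single call:

```python
import torch

# Illustrative only: one 7x7 linear transfer map per setting, applied to a
# matching batch of beams, each holding 1000 particles in 7D phase space.
transfer_maps = torch.randn(100, 7, 7)
particles = torch.randn(100, 1000, 7)
# Batched matrix multiply: all 100 simulations in one vectorised op.
out = particles @ transfer_maps.transpose(-2, -1)
print(out.shape)  # torch.Size([100, 1000, 7])
```

On top of being faster than 100 separate tracking calls, this keeps the whole batch inside one autograd graph, which is what makes batched gradient-based applications convenient.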
Closes #100.
Types of changes
Checklist
- `flake8` (required).
- `pytest` tests pass (required).
- `pytest` on a machine with a CUDA GPU and made sure all tests pass (required).

Note: We are using a maximum length of 88 characters per line.