A very basic working example to learn samples #142
Conversation
examples/example_learn_samples.py
Outdated
```python
    density=True,
    squeeze_dims=False,
)
self.trajectory = self.operator.samples_torch
```
This is just done for simplicity.
Maybe you want to make it a property?
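A minimal sketch of that suggestion, reusing the `Model`/`operator` names from the example (not the PR's actual code):

```python
class Model(torch.nn.Module):
    # Expose the trajectory as a property instead of aliasing the tensor in
    # __init__, so it always reflects the operator's current parameter.
    @property
    def trajectory(self):
        return self.operator.samples_torch
```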
src/mrinufft/operators/autodiff.py
Outdated
```python
if hasattr(self.nufft_op, name):
    if name == 'samples':
        return self.samples_torch
    return getattr(self.nufft_op, name)
else:
    return super().__getattr__(name)
```
This will make sure we behave like a `torch.nn.Module` while also being a `FourierOperatorBase`.
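For illustration, a hypothetical lookup sequence under this routing (wrapper name and constructor are assumptions, not confirmed API):

```python
op = MRINufftAutoGrad(nufft_op)  # assumed autodiff wrapper around a nufft operator
op.shape    # falls through __getattr__ to the wrapped nufft_op
op.samples  # intercepted and returned as the torch tensor samples_torch
```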
see above
A good start; adding an example for the autodiff is clearly required! However, I would recommend adding more detail in the example to explain what is happening.
examples/example_learn_samples.py
Outdated
""" | ||
============================= | ||
Density Compensation Routines | ||
============================= | ||
|
||
Examples of differents density compensation methods. | ||
|
||
Density compensation depends on the sampling trajectory,and is apply before the | ||
adjoint operation to act as preconditioner, and should make the lipschitz constant | ||
of the operator roughly equal to 1. | ||
|
||
""" |
A copy-paste to update, I suppose.
Ah no, it's just a temporary file. Let me mark this as a draft.
src/mrinufft/operators/autodiff.py
Outdated
```python
self.samples_torch = torch.nn.Parameter(
    data=torch.Tensor(self.nufft_op.samples),
    requires_grad=True,
)
```
I would make it a private attribute; that's for internal cooking. Users are expected to use `.samples`.
But `samples` will route to `samples_torch`. I understand your point though, let me do it. It's just that I am wondering if torch can handle the routing and properties right for autodiff.
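A minimal, self-contained sketch (hypothetical `Wrapper` class, assumed shapes) suggesting that torch handles a property routed to a private parameter fine under autodiff:

```python
import torch

class Wrapper(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # private parameter holding the trajectory
        self._samples_torch = torch.nn.Parameter(torch.randn(16, 2))

    @property
    def samples(self):
        return self._samples_torch

w = Wrapper()
(w.samples ** 2).sum().backward()
print(w._samples_torch.grad is not None)  # True: the property does not break autograd
```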
src/mrinufft/operators/autodiff.py
Outdated
```python
if hasattr(self.nufft_op, name):
    if name == 'samples':
        return self.samples_torch
    return getattr(self.nufft_op, name)
```
Don't make the exception here; just add a property. Also, this will break if it is not initialized.
```python
@property
def samples(self):
    try:
        return self._samples_torch
    except AttributeError:
        return self.nufft_op.samples

def __getattr__(self, name):
    return getattr(self.nufft_op, name)
```
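With this fallback, `.samples` resolves to the wrapped operator's array whenever `_samples_torch` was never created, which should also cover the `grad_wrt_data`-only corner case discussed below.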
Ah, good point. But then if we are in the autodiff module with `grad_wrt_traj`, we SHOULD have it initialized. I don't know if this misses any corner cases, though (perhaps if we want `grad_wrt_data` and request `samples`).
if "gpu" in interface: | ||
operator.to("cuda") | ||
else: | ||
operator.cpu() |
What does this mean? `operator` is not a tensor here.
`operator` is a `torch.nn.Module` here. This means all the underlying parameters/weights will be on GPU or CPU. It is very "torch-like" code, and this is good for us.
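A minimal sketch (hypothetical `Wrapper` class) of why calling `.to("cuda")` on a module is meaningful: every registered parameter, such as the trajectory, travels with it:

```python
import torch

class Wrapper(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # registered parameters move together with the module
        self.samples = torch.nn.Parameter(torch.zeros(8, 2))

op = Wrapper()
op.to("cuda")             # moves all registered parameters to the GPU
print(op.samples.device)  # cuda:0 (on a CUDA-capable machine)
op.cpu()                  # and back to the CPU
```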
examples/example_learn_samples.py
Outdated
```python
class Model(torch.nn.Module):
    def __init__(self):
```
Why do you need the extra module layer?
I think and expect this is how users will use our operator; see the sketch below. The test handles the case where they don't instantiate it in a module, and the example handles it being instantiated in a module. Note that this was failing earlier, hence this PR to begin with. Eventually we might want the model to dynamically update the shape of the trajectory (during upsampling steps, for example).
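A sketch of that intended usage, assuming mrinufft's `get_operator` factory; the backend name and keyword arguments other than `density`/`squeeze_dims` (which appear in this PR's diff) are illustrative, not a confirmed API:

```python
import torch
from mrinufft import get_operator

class Model(torch.nn.Module):
    def __init__(self, kspace_loc, shape):
        super().__init__()
        # instance the autodiff-enabled operator inside the module so the
        # trajectory is registered and trained like any other parameter
        self.operator = get_operator("finufft")(
            kspace_loc, shape, density=True, squeeze_dims=False
        )

    def forward(self, x):
        # acquire, then reconstruct; the loss gradient flows back to the samples
        return self.operator.adj_op(self.operator.op(x))
```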
Closing as #167 is now here!
This had to be set up after discussing with @alineyyy, where we found some issues that made it harder to instantiate the operator inside a torch module. @paquiteau, maybe a quick review of the code to ensure we agree on style (we can meet if required). @alineyyy, please update the learn-samples code to make it cleaner. Please also add a simple projection step to ensure we sample the center of k-space.
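For reference, a hypothetical projection step in the spirit of that request (not part of this PR); it assumes samples of shape `(n_shots, n_points, dim)` normalized to `[-0.5, 0.5)`:

```python
import torch

@torch.no_grad()
def project_trajectory(samples: torch.Tensor) -> None:
    """Clamp the learned samples and pin each shot's first point to the center."""
    samples.clamp_(-0.5, 0.5)  # stay inside the Nyquist band
    samples[:, 0, :] = 0.0     # every shot starts at the k-space center
```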