
A very basic working example to learn samples #142

Closed · wants to merge 22 commits

Conversation

chaithyagr (Member):

This had to be set up after discussing with @alineyyy, where we found some issues that made it harder to instantiate the operator inside a torch module. @paquiteau, maybe a quick review of the code to ensure we agree on style (we can meet if required). @alineyyy, please update the learn-samples example code to make it cleaner. Please also add a simple projection step to ensure we sample the center of k-space.
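
A minimal sketch of what such a projection step could look like (a hypothetical helper, not code from this PR; it assumes samples are normalized to [-0.5, 0.5] with shape (..., n_points, dim), and it models the center constraint by pinning the first point of each shot to the k-space origin):

import torch

def project_trajectory(samples: torch.Tensor) -> torch.Tensor:
    """Project learned samples back onto the feasible set.

    Hypothetical helper: clamps samples into the normalized
    k-space box and forces the first sample of each shot to
    the center of k-space.
    """
    with torch.no_grad():
        samples.clamp_(-0.5, 0.5)   # stay inside the Nyquist box
        samples[..., 0, :] = 0.0    # sample the center of k-space
    return samples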

density=True,
squeeze_dims=False,
)
self.trajectory = self.operator.samples_torch

Member Author:

This is just done for simplicity

Member:

Maybe you want to make it a property?
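
For reference, such a property might look like this (a sketch only, assuming the wrapped operator exposes samples_torch as in the diff above):

@property
def trajectory(self):
    # Always reflects the current samples instead of caching
    # a reference at construction time.
    return self.operator.samples_torch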

src/mrinufft/_utils.py (outdated, resolved)
Comment on lines 156 to 161
if hasattr(self.nufft_op, name):
if name == 'samples':
return self.samples_torch
return getattr(self.nufft_op, name)
else:
return super().__getattr__(name)

Member Author:

This will make sure we behave like a torch.nn.Module while also being a FourierOperatorBase.

Member:

see above

paquiteau (Member) left a comment:

A good start; adding an example for the autodiff is clearly required! However, I would recommend adding more detail in the example to explain what is happening.

Comment on lines 2 to 13
"""
=============================
Density Compensation Routines
=============================

Examples of different density compensation methods.

Density compensation depends on the sampling trajectory and is applied before
the adjoint operation to act as a preconditioner; it should make the Lipschitz
constant of the operator roughly equal to 1.

"""

Member:

A copy-pasta to update, I suppose.

Member Author:

Ahh no no, it's just a temporary file. Let me mark this as a draft.

Comment on lines 141 to 143
self.samples_torch = torch.nn.Parameter(
data=torch.Tensor(self.nufft_op.samples),
requires_grad=True,

Member:

I would make it a private attribute; that's for internal cooking. Users are expected to use .samples.

Member Author:

But samples will route to samples_torch. I understand your point though, let me do it. It's just that I am wondering whether torch can handle the routing and properties correctly for autodiff.
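
A quick self-contained check suggests autograd handles the property routing fine, since the property returns the registered Parameter itself (a sketch with made-up names, independent of mrinufft):

import torch

class Wrapper(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self._samples_torch = torch.nn.Parameter(torch.randn(10, 2))

    @property
    def samples(self):
        # Returns the same Parameter object, so gradients still
        # accumulate on _samples_torch.
        return self._samples_torch

w = Wrapper()
loss = (w.samples ** 2).sum()
loss.backward()
assert w._samples_torch.grad is not None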

Comment on lines 156 to 159
if hasattr(self.nufft_op, name):
if name == 'samples':
return self.samples_torch
return getattr(self.nufft_op, name)

Member:

Don't make the exception here; just add a property. Also, this will break if you don't have it initialized:

@property
def samples(self):
    try:
        return self._samples_torch 
    except AttributeError:
        return self.nufft_op.samples

def __getattr__(self, name):
    return getattr(self.nufft_op, name)

Member Author:

Ahh, good point, but then if we are in the autodiff module with grad_wrt_traj, we SHOULD have it initialized. But I don't know if it misses any corner cases (perhaps if we want grad_wrt_data and request samples).

Comment on lines +78 to +81
if "gpu" in interface:
operator.to("cuda")
else:
operator.cpu()

Member:

What does this mean? operator is not a tensor here.

Member Author:

operator is a torch.nn.Module here, so all the underlying parameters / weights will be moved to GPU or CPU. It is very "torch-like" code, and that is good for us.
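
A minimal illustration of that behavior (a made-up module, not the PR code; the CUDA branch naturally needs a GPU):

import torch

class TinyOp(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # A registered Parameter, so Module.to() moves it too.
        self.samples_torch = torch.nn.Parameter(torch.zeros(10, 2))

op = TinyOp()
if torch.cuda.is_available():
    op.to("cuda")
    assert op.samples_torch.device.type == "cuda"
else:
    op.cpu()
    assert op.samples_torch.device.type == "cpu"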

Comment on lines 27 to 28
class Model(torch.nn.Module):
def __init__(self):

Member:

Why do you need the extra module layer?

Member Author:

I think and expect this is how users will use our operator. The test handles the case where they don't instantiate it in a module, and the example handles it being instantiated in a module. Note that this was failing earlier, which is why we have this PR to begin with. Eventually we might want the model to dynamically update the shape of the trajectory (during upsampling steps, for example).
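
Roughly the usage pattern being targeted (a sketch under assumptions: the get_operator call with wrt_data/wrt_traj follows the mrinufft autodiff interface, and the backend and shapes are illustrative):

import torch
from mrinufft import get_operator

class Model(torch.nn.Module):
    def __init__(self, trajectory, img_shape):
        super().__init__()
        # Operator instantiated inside a torch module, which is
        # what this PR makes work.
        self.operator = get_operator(
            "gpunufft", wrt_data=True, wrt_traj=True
        )(
            trajectory,
            shape=img_shape,
            density=True,
            squeeze_dims=False,
        )

    def forward(self, image):
        # Forward NUFFT; gradients w.r.t. the trajectory flow
        # through the samples Parameter inside the operator.
        return self.operator.op(image)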

chaithyagr marked this pull request as draft on June 28, 2024.

chaithyagr (Member Author):

Closing as #167 is now here!

chaithyagr closed this on Jul 24, 2024.