Commit: update
fzimmermann89 committed Jan 14, 2025
1 parent 2156518 commit ac37591
Showing 10 changed files with 23 additions and 23 deletions.
2 changes: 1 addition & 1 deletion docs/source/_templates/class_template.rst
@@ -4,7 +4,7 @@

.. autoclass:: {{ objname }}
:members:
- :special-members: '__init__, __matmul__, __add__, __mul__, __or__, __and__, __radd__, __rmul__, __rmatmul__, __ror__, __rand__, __truediv__, __eq__, __pow__'
+ :special-members: '__init__, __call__, __matmul__, __add__, __mul__, __or__, __and__, __radd__, __rmul__, __rmatmul__, __ror__, __rand__, __truediv__, __eq__, __pow__'
:inherited-members: Module
:show-inheritance:

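For context, adding ``__call__`` to :special-members: means autodoc will now also render the call signature of operator-style classes. A minimal, hypothetical class (illustrative only, not MRpro code) whose ``__call__`` and ``__matmul__`` docstrings this template would pick up:

    import torch

    class ScaleOperator:
        """Multiply the input by a fixed factor."""

        def __init__(self, factor: float) -> None:
            self.factor = factor

        def __call__(self, x: torch.Tensor) -> torch.Tensor:
            """Apply the operator; this docstring is now rendered via :special-members:."""
            return self.factor * x

        def __matmul__(self, other: 'ScaleOperator') -> 'ScaleOperator':
            """Compose two operators into a new one."""
            return ScaleOperator(self.factor * other.factor)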
1 change: 1 addition & 0 deletions docs/source/conf.py
@@ -38,6 +38,7 @@
'sphinx.ext.intersphinx',
'sphinx_autodoc_typehints',
'sphinx.ext.autosectionlabel',
+ 'sphinx_copybutton'
]
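As a hedged aside, sphinx-copybutton is usually tuned with a couple of options in conf.py; the settings below are its documented knobs for stripping prompts from copied code and are not part of this commit:

    # Strip REPL and shell prompts so copied snippets paste cleanly.
    copybutton_prompt_text = r'>>> |\.\.\. |\$ '
    copybutton_prompt_is_regexp = True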


4 changes: 2 additions & 2 deletions docs/source/contributor_guide.rst
@@ -84,13 +84,13 @@ You can use VSCode's test panel to discover and run tests. All tests must pass b

Building the Documentation
==========================
- You can build the documentation locally by running ```make html``` in the docs folder. The documentation will also be built in each PR and can be viewed online.
+ You can build the documentation locally by running ``make html`` in the docs folder. The documentation will also be built in each PR and can be viewed online.
Please check how your new additions render in the documentation before requesting a PR review.


Adding new Examples
===================
- New exciting applications of MRpro can be added in ```examples``` as only ```.py``` files with code-cells. These can, for example, be used in VSCode with the python extension, or in JupyterLab with the `jupytext <https://jupytext.readthedocs.io/en/latest/>`_ extension.
+ New exciting applications of MRpro can be added in ``examples`` as only ``.py`` files with code-cells. These can, for example, be used in VSCode with the python extension, or in JupyterLab with the `jupytext <https://jupytext.readthedocs.io/en/latest/>`_ extension.
A pre-commit action will convert the scripts to notebooks. Our documentation build will pick up these notebooks, run them, and include them with outputs in the documentation.
The data to run the examples should be publicly available and hosted externally, for example on Zenodo.
Please be careful not to add any binary files to your commits.
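A minimal sketch of such a ``.py`` file in jupytext's percent format (the content is illustrative; only the ``# %%`` cell markers matter):

    # %% [markdown]
    # # Hypothetical example
    # Short introduction rendered as a markdown cell in the generated notebook.

    # %%
    import torch

    # %%
    # Each "# %%" marker starts a new code cell; the pre-commit action converts
    # this script into a notebook that the documentation build then executes.
    x = torch.linspace(0, 1, 5)
    print(x.mean())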
2 changes: 1 addition & 1 deletion docs/source/examples.rst
@@ -5,7 +5,7 @@ Examples
:target: https://colab.research.google.com/github/PTB-MR/mrpro

Notebooks with examples of how you can use MRpro.
- Each notebook can be launched in Colab |colab-badge|
+ Each notebook can be launched in Colab: |colab-badge|

.. toctree::
:maxdepth: 1
3 changes: 1 addition & 2 deletions docs/source/user_guide.rst
@@ -39,8 +39,7 @@ A basic pipeline would contain the following steps:

The following provides some basic information about these steps.
For more detailed information please have a look at the :doc:`examples`.
- You can easily launch notebooks via the |colab-badge| badge and give the notebooks a try without having to
- install anything.
+ You can easily launch notebooks via the |colab-badge| badge and give the notebooks a try -

Reading in raw data
-------------------
14 changes: 7 additions & 7 deletions src/mrpro/algorithms/optimizers/adam.py
@@ -35,13 +35,13 @@ def adam(
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \quad \hat{v}_t = \frac{v_t}{1 - \beta_2^t} \\
\theta_{t+1} &= \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon} \hat{m}_t
- where:
- - :math:`g_t` is the gradient at step :math:`t`,
- - :math:`m_t` and :math:`v_t` are biased estimates of the first and second moments,
- - :math:`\hat{m}_t` and :math:`\hat{v}_t` are bias-corrected estimates,
- - :math:`\eta` is the learning rate,
- - :math:`\epsilon` is a small constant for numerical stability,
- - :math:`\beta_1` and :math:`\beta_2` are decay rates for the moment estimates.
+ where
+ :math:`g_t` is the gradient at step :math:`t`,
+ :math:`m_t` and :math:`v_t` are biased estimates of the first and second moments,
+ :math:`\hat{m}_t` and :math:`\hat{v}_t` are bias-corrected estimates,
+ :math:`\eta` is the learning rate,
+ :math:`\epsilon` is a small constant for numerical stability,
+ :math:`\beta_1` and :math:`\beta_2` are decay rates for the moment estimates.
Steps of the Adam algorithm:
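To make the equations above concrete, here is a minimal, self-contained sketch of a single Adam update on plain tensors (illustrative only, not MRpro's optimizer API):

    import torch

    def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update following the equations above."""
        m = beta1 * m + (1 - beta1) * grad         # biased first-moment estimate
        v = beta2 * v + (1 - beta2) * grad**2      # biased second-moment estimate
        m_hat = m / (1 - beta1**t)                 # bias-corrected estimates
        v_hat = v / (1 - beta2**t)
        theta = theta - lr * m_hat / (v_hat.sqrt() + eps)
        return theta, m, v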
2 changes: 1 addition & 1 deletion src/mrpro/data/Data.py
@@ -14,7 +14,7 @@ class Data(MoveDataMixin, ABC):
"""A general data class with field data and header."""

data: torch.Tensor
- """Data. Shape (...other coils k2 k1 k0)"""
+ """Data. Shape `(...other coils k2 k1 k0)`"""

header: Any
"""Header information for data."""
4 changes: 2 additions & 2 deletions src/mrpro/data/IData.py
@@ -76,9 +76,9 @@ def from_tensor_and_kheader(cls, data: torch.Tensor, kheader: KHeader) -> Self:
Parameters
----------
data
- torch.Tensor containing image data with dimensions (broadcastable to) (other, coils, z, y, x).
+ torch.Tensor containing image data with dimensions (broadcastable to) `(other, coils, z, y, x)`.
kheader
- MR raw data header (KHeader) containing required meta data for the image header (IHeader).
+ MR raw data header containing required meta data for the image header (`mrpro.data.IHeader`).
"""
header = IHeader.from_kheader(kheader)
return cls(header=header, data=data)
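A hedged usage sketch of this classmethod; it assumes a ``KHeader`` named ``kheader`` was obtained elsewhere from raw data, and the import path is inferred from the package layout:

    import torch
    from mrpro.data import IData

    # `kheader` is assumed to exist (e.g. from previously loaded k-space data).
    image = torch.zeros(1, 1, 1, 256, 256, dtype=torch.complex64)  # broadcastable to (other, coils, z, y, x)
    idata = IData.from_tensor_and_kheader(data=image, kheader=kheader)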
6 changes: 3 additions & 3 deletions src/mrpro/operators/models/WASABI.py
@@ -82,13 +82,13 @@ def forward(
signal with shape `(offsets *other, coils, z, y, x)`
"""
offsets = unsqueeze_right(self.offsets, b0_shift.ndim - (self.offsets.ndim - 1)) # -1 for offset
- delta_x = offsets - b0_shift
+ delta_b = offsets - b0_shift
b1 = self.b1_nominal * relative_b1

signal = (
c
- d
- * (torch.pi * b1 * self.gamma * self.tp) ** 2
- * torch.sinc(self.tp * torch.sqrt((b1 * self.gamma) ** 2 + delta_x**2)) ** 2
+ * (torch.pi * b1 * self.gamma * self.rf_duration) ** 2
+ * torch.sinc(self.rf_duration * torch.sqrt((b1 * self.gamma) ** 2 + delta_b**2)) ** 2
)
return (signal,)
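A hedged usage sketch after the rename; the constructor arguments and tensor shapes below are assumptions inferred from the attributes visible in this hunk, not a definitive API:

    import torch
    from mrpro.operators.models import WASABI

    offsets = torch.linspace(-300, 300, 31)         # frequency offsets (illustrative values)
    model = WASABI(offsets=offsets, rf_duration=0.005)

    b0_shift = torch.zeros(1, 1, 1, 1)
    relative_b1 = torch.ones(1, 1, 1, 1)
    c = torch.ones(1, 1, 1, 1)
    d = torch.ones(1, 1, 1, 1)
    (signal,) = model(b0_shift, relative_b1, c, d)  # shape (offsets, *other, coils, z, y, x)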
8 changes: 4 additions & 4 deletions src/mrpro/operators/models/WASABITI.py
@@ -17,7 +17,7 @@ def __init__(
rf_duration: float | torch.Tensor = 0.005,
b1_nominal: float | torch.Tensor = 3.75,
gamma: float | torch.Tensor = 42.5764,
- lamor_frequency: float | torch.Tensor = 127.7292,
+ larmor_frequency: float | torch.Tensor = 127.7292,
) -> None:
"""Initialize WASABITI signal model for mapping of B0, B1 and T1 [SCH2023]_.
@@ -33,7 +33,7 @@
nominal B1 amplitude [µT]
gamma
gyromagnetic ratio [MHz/T]
- lamor_frequency
+ larmor_frequency
larmor frequency [MHz]
References
@@ -47,7 +47,7 @@
rf_duration = torch.as_tensor(rf_duration)
b1_nominal = torch.as_tensor(b1_nominal)
gamma = torch.as_tensor(gamma)
- lamor_frequency = torch.as_tensor(lamor_frequency)
+ larmor_frequency = torch.as_tensor(larmor_frequency)

if recovery_time.shape != offsets.shape:
raise ValueError(
@@ -60,7 +60,7 @@
self.rf_duration = nn.Parameter(rf_duration, requires_grad=rf_duration.requires_grad)
self.b1_nominal = nn.Parameter(b1_nominal, requires_grad=b1_nominal.requires_grad)
self.gamma = nn.Parameter(gamma, requires_grad=gamma.requires_grad)
- self.lamor_frequency = nn.Parameter(lamor_frequency, requires_grad=lamor_frequency.requires_grad)
+ self.larmor_frequency = nn.Parameter(larmor_frequency, requires_grad=larmor_frequency.requires_grad)

def forward(self, b0_shift: torch.Tensor, relative_b1: torch.Tensor, t1: torch.Tensor) -> tuple[torch.Tensor,]:
"""Apply WASABITI signal model.
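A hedged usage sketch focusing on the renamed keyword; it assumes ``offsets`` and ``recovery_time`` are the leading constructor arguments, as the shape check above suggests:

    import torch
    from mrpro.operators.models import WASABITI

    offsets = torch.linspace(-300, 300, 31)
    recovery_time = torch.full_like(offsets, 2.0)   # must match the shape of offsets (see check above)

    # The keyword is now `larmor_frequency` (previously `lamor_frequency`).
    model = WASABITI(offsets, recovery_time, larmor_frequency=127.7292)

    b0_shift = torch.zeros(1, 1, 1, 1)
    relative_b1 = torch.ones(1, 1, 1, 1)
    t1 = torch.ones(1, 1, 1, 1)
    (signal,) = model(b0_shift, relative_b1, t1)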
