
[REVIEW]: MIRTorch: A Differentiable Medical Image Reconstruction Toolbox #7340

Open
editorialbot opened this issue Oct 9, 2024 · 15 comments
Assignees
Labels
Python review TeX Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials

Comments


editorialbot commented Oct 9, 2024

Submitting author: @GuanhuaW (Guanhua Wang)
Repository: https://github.com/guanhuaw/MIRTorch
Branch with paper.md (empty if default branch): feature/joss
Version: v0.1.2
Editor: @ymzayek
Reviewers: @paquiteau, @jonbmartin
Archive: Pending

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/ab29a08651ff1a8da37b363bd6185820"><img src="https://joss.theoj.org/papers/ab29a08651ff1a8da37b363bd6185820/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/ab29a08651ff1a8da37b363bd6185820/status.svg)](https://joss.theoj.org/papers/ab29a08651ff1a8da37b363bd6185820)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@paquiteau & @jonbmartin, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @ymzayek know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Checklists

📝 Checklist for @paquiteau

📝 Checklist for @jonbmartin

@editorialbot editorialbot added Python review TeX Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials labels Oct 9, 2024
@editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1109/TMI.2018.2833635 is OK
- 10.1109/TMI.2018.2865356 is OK
- 10.1137/080716542 is OK
- 10.1007/s10851-010-0251-1 is OK
- 10.1561/2400000003 is OK
- 10.1002/mrm.29645 is OK
- 10.1002/mrm.24389 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Michigan Image Reconstruction Toolbox
- No DOI given, and none found for title: SigPy: a python package for high performance itera...
- No DOI given, and none found for title: The BART toolbox for computational magnetic resona...

❌ MISSING DOIs

- None

❌ INVALID DOIs

- 10.6028/igres.049.044 is INVALID

@editorialbot

Software report:

github.com/AlDanial/cloc v 1.90  T=0.08 s (1346.0 files/s, 181196.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          44           1435           2496           6130
Jupyter Notebook                 6              0           2343            631
Markdown                         5            130              0            270
YAML                             6             27             18            187
reStructuredText                47            547            496            111
TeX                              1             10              0            109
TOML                             1              8              0             74
DOS Batch                        1              8              1             26
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                           112           2169           5361           7547
-------------------------------------------------------------------------------

Commit count by author:

    39	m5520
    36	neelsh
    30	guanhuaw
    29	Guanhua
    18	Keyue Zhu
     5	quickstep
     4	SoniaMinseoKim
     3	Zongyu Li
     1	GuanhuaW
     1	Jeff Fessler
     1	nnmurthy
     1	ray@omni

@editorialbot

Paper file info:

📄 Wordcount for paper.md is 717

✅ The paper includes a Statement of need section

@editorialbot

License info:

🟡 License found: Other (Check here for OSI approval)

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


ymzayek commented Oct 9, 2024

Dear @paquiteau & @jonbmartin, you can start your review by creating your tasklist with the following command:

@editorialbot generate my checklist

In that list, there are several tasks. Whenever you complete a task, you can check the corresponding checkbox. You can also reference the JOSS reviewer guidelines, which are linked in the first comment in this thread. Since the JOSS review process is interactive, you can always interact with the author, the other reviewers, and the editor during the process. You can open issues and pull requests in the target repo; please mention the URL of this page in them so that we can keep track of what is going on.

Thank you in advance and please let me know at any time if you have any questions.

@editorialbot

I'm sorry human, I don't understand that. You can see what commands I support by typing:

@editorialbot commands


paquiteau commented Oct 9, 2024

Review checklist for @paquiteau

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/guanhuaw/MIRTorch?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@GuanhuaW) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?


jonbmartin commented Oct 9, 2024

Review checklist for @jonbmartin

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/guanhuaw/MIRTorch?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@GuanhuaW) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?


paquiteau commented Oct 9, 2024

Hello @guanhuaw,
I have read the MIRTorch paper, and the state of the art regarding MRI image reconstruction toolboxes seems incomplete. There are quite a few of them, and the paper should clearly state how MIRTorch differs from the other toolboxes (what does it do better, faster, or easier?). Without being exhaustive, I think deepinv [1], pysap-mri [2], and scico should be mentioned; they are Python toolboxes for solving image inverse problems, with deep learning and/or compressed sensing reconstruction, and MRI operators.

Regarding MRI capabilities, our team maintains mind-inria/mri-nufft, which provides fast operators (at least faster than torchkbnufft, used by MIRTorch) for non-Cartesian imaging, with autodiff capabilities as well; see our examples [3].
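A standard sanity check for the kind of differentiable forward operators discussed above is the adjoint (dot-product) test: ⟨Ax, y⟩ must equal ⟨x, Aᴴy⟩, otherwise gradients computed through the adjoint are wrong. Below is a minimal NumPy sketch of that test with a unitary DFT standing in for the forward operator; it is an illustration only and does not use the MIRTorch or mri-nufft APIs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16

# Unitary DFT matrix as a stand-in for a reconstruction forward operator A.
A = np.fft.fft(np.eye(n), norm="ortho")

# Random complex test vectors.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Adjoint test: <A x, y> must equal <x, A^H y> for A^H to be a valid adjoint.
lhs = np.vdot(A @ x, y)          # vdot conjugates its first argument
rhs = np.vdot(x, A.conj().T @ y)
assert np.allclose(lhs, rhs)
```

In practice the same check applies to any LinearMap-style operator pair (forward, adjoint), e.g. a NUFFT and its conjugate-transpose, using random inputs of the appropriate shapes.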

I have also opened an issue on your repo regarding documentation and installation instructions.

Footnotes

[1] Full disclosure: some of my coworkers at Inria are maintainers of deepinv.

[2] It is (not so actively) maintained in our team.

[3] We are in the process of writing a JOSS paper as well.


ymzayek commented Oct 22, 2024

Hi just checking in here to see if there are any updates. @guanhuaw don't hesitate to let me or the reviewers know if you have any questions concerning the comments, etc.

@jonbmartin

Hi @guanhuaw and @ymzayek, I apologize for the delay in completing my review! I've completed some initial steps as indicated on the checklist and will finish my review by the end of the week.

Cheers,
Jonathan


ymzayek commented Nov 15, 2024

Hi everyone, just checking in to see how the review process is going. On the reviewer side, @jonbmartin and @paquiteau are your review checklists up to date? @guanhuaw if you make updates that are not captured by the reviewers don't hesitate to comment as such on this thread.


jonbmartin commented Nov 18, 2024

Hi @guanhuaw and @ymzayek, here is my initial review:

  • Installation instructions: I would like to see the required Python version somewhere in the README, docs page, etc. to facilitate a casual check of Python compatibility for potential users. Similarly, it would be nice to link to the PyPI page somewhere in the documentation and/or README. (Aside: not a blocker for me w.r.t. the review, but I would love to see it installable through conda as well!)
  • Installation/functionality: Very smooth and slick! Verified on a local system w/ CPU and GPU w/ CUDA 11.8. All example code runs nicely. The only issue I encountered was with loading the .h5 file for the demo_3d.ipynb example, detailed in issue guanhuaw/MIRTorch#31. Not really MIRTorch-specific, just an FYI that I had trouble with that data. So not a blocker.
  • Functionality documentation: This feels a little lacking. The code tutorials give excellent examples, but there isn't much API documentation. I think Frank Ong did a great job with this for SigPy: they 1) have pages outlining detailed usage for specific functions and 2) include some generic code usage examples. I'd like to see a little more documentation developed before I check this box.

EDIT 11/20: Re: functionality documentation, I apologize that I didn't notice all of the detail on readthedocs! e.g. https://mirtorch.readthedocs.io/en/latest/LinearMap.html. This is great, and I consider that item complete.

  • Community guidelines: Also a little lacking. I know GitHub has a standard approach, but the checklist asks for some specific guidelines; the repo should probably add a bit more detail to satisfy this item.

Overall, I'm a huge fan! My critiques are really with the documentation; the code itself seems solid by my judgement. Only the 3 documentation boxes above remain unticked.

Jon
