
TIGRE-DE #521

Merged
merged 9 commits into from
Mar 1, 2024

Conversation

akeeler
Contributor

@akeeler akeeler commented Feb 28, 2024

Adds the functionality discussed in this paper: https://aapm.onlinelibrary.wiley.com/doi/10.1002/mp.17002

The main functionality is the DeDecompose.m function, which maps low- and high-kV CBCT projections to equivalent thicknesses of basis materials. A demo calibration is provided in the demo_de_cal.mat file, which contains a pair of calibrations mapping Varian Truebeam data with energy pairs (80 kV, 140 kV) or (90 kV, 140 kV) to Aluminum and polymethyl-methylacrylate basis materials. proj_interp.m is used to match projection angles to avoid aberrations in the basis materials due to mismatched projection data.

MakeVMproj.m takes the basis thickness projections from DeDecompose.m and uses them to synthesize Virtual Monoenergetic projections at a requested monoenergy between 20 and 150 keV.
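For reference, the decomposition step could be sketched in Python roughly as follows. The polynomial basis and coefficient layout here are illustrative assumptions, not the actual form of Eq. (2) in the paper:

```python
import numpy as np

def decompose(p_low, p_high, coeffs):
    """Map low/high-kV log projections to basis-material thicknesses.

    Hypothetical sketch only: the 9 calibrated coefficients per material
    are assumed to weight the tensor-product polynomial basis
    {1, L, L^2} x {1, H, H^2}; the real basis is whatever Eq. (2)
    of the paper defines.
    """
    L = np.asarray(p_low, dtype=float)
    H = np.asarray(p_high, dtype=float)
    # 9 polynomial terms, ordered by (i, j) for L**i * H**j
    terms = np.stack([L**i * H**j for i in range(3) for j in range(3)], axis=-1)
    return terms @ coeffs  # (..., 2): thickness of each basis material
```

With a calibrated `coeffs` array of shape (9, 2), one column per basis material, this turns each pixel's low/high log-projection pair into a pair of equivalent material thicknesses.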

@AnderBiguri
Member

Hi @akeeler , fantastic work!

Can I ask some questions, to better understand how to fit this into TIGRE?

  • If I understand this right (apologies, I haven't done much DE myself), what you do here is take projections at two energy levels, then do some magic to produce two material projections, correct?
  • To do this transform, the machine has to be calibrated for the materials, is that right? Is that what the .mat file is? If so, the .mat file then contains just 9x2 values, the parameters in equation (2)?
  • VM projection seems to be something that DE researchers/users want to produce: a projection that mixes the other two. Am I right? Is this type of projection always generated the same way? You provided example calibration values for the material decomposition, but the VM projection code has hardcoded numbers. How flexible is this? Or is it only valid for Al/PMMA?
  • Is any of the math here Varian-specific in any way other than the calibration? How generalizable is that calibration? I.e., can someone with a Varian scanner just use it, or does it need to be calibrated per machine?

Notes on coding (that depend a bit on the previous answers):

  • We should turn the .mat into a CSV or something like that, if it's only 18 numbers.
  • We should move it to ./Common/data.
  • The DE code should have its own folder in Utilities, as it is mostly IO.
  • Can I make commits to this branch to change minor stuff (like input parsing/error handling)?

@akeeler
Contributor Author

akeeler commented Feb 29, 2024

Thanks for your questions @AnderBiguri. I'll do my best to answer them.

Yes, the key methodology is taking two energy level projections and converting them into two material projections. The method follows Eq. 2 in the paper. As you state, the coefficients for the transfer equations need to be calibrated, and that is what the .mat file is for. I tried to structure this .mat file after the Pixel2HU.mat file.

The calibration file currently contains two structs of 9x2 values, because the calibration depends on the material projections being created, the energies of the initial scans, and the scanner being used. So, anyone who wants to do DE imaging with a Varian Truebeam system, use Al and PMMA as basis materials, and image at either 80 and 140 kV or 90 and 140 kV should be able to use the provided calibrations out of the box, while further changes would necessitate a new calibration being added.

The VM projection construction implements Eq. 1 in the paper, so the hard-coded values are the Al and PMMA linear attenuation coefficients from NIST. These would only need to change if a user wanted to use different basis materials. On reflection, it might be best to put these in a CSV instead of hard-coding. VM projection creation is common, though not the only thing that DE imaging is used for, so the ability to create additional image types (like direct computation of electron density) can be added in future updates.
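As a rough illustration of the Eq. (1) synthesis described above (the attenuation tables below are made-up placeholder values, not the NIST data used by MakeVMproj.m):

```python
import numpy as np

# Placeholder values for illustration only; MakeVMproj.m uses tabulated
# NIST linear attenuation coefficients for Al and PMMA.
energies_kev = np.array([20.0, 60.0, 100.0, 150.0])
mu_al = np.array([0.93, 0.075, 0.046, 0.038])     # 1/mm (placeholder)
mu_pmma = np.array([0.068, 0.023, 0.020, 0.017])  # 1/mm (placeholder)

def make_vm_proj(t_al, t_pmma, e_kev):
    """Virtual monoenergetic log projection: mu_Al(E)*t_Al + mu_PMMA(E)*t_PMMA."""
    if not 20.0 <= e_kev <= 150.0:
        raise ValueError("requested monoenergy must be between 20 and 150 keV")
    # Interpolate each material's attenuation at the requested monoenergy
    mu_a = np.interp(e_kev, energies_kev, mu_al)
    mu_p = np.interp(e_kev, energies_kev, mu_pmma)
    return mu_a * t_al + mu_p * t_pmma
```

Swapping in a different pair of attenuation tables is all that would be needed for other basis materials, which is why moving those values out of the code and into a CSV makes sense.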

The only part of the math that is Varian-specific is the calibration file. The rest should be system-agnostic.

On the coding, feel free to make any adjustments to better fit the desired style and best practices. I'm certainly no coding expert! Also feel free to make commits that clean things up, and I'm happy to review or test any changes you make to ensure it still works properly.

@AnderBiguri
Member

Fantastic @akeeler , thanks for the clarifications!
Then, if you are OK with it, I will (in the following days) make some commits to this PR (which will also change your code, as this PR is from your repo) to refactor some things, explicitly error in some cases, etc. I'll let you know "when I am done", so you can have a look, verify and let me know if my changes make sense, and then we can merge it :)

@akeeler
Contributor Author

akeeler commented Feb 29, 2024

Feel free. Please let me know if you need any permissions to add commits to my repository.

@AnderBiguri
Member

@akeeler thanks! No, by making a PR you implicitly allow me to commit to this branch in your repo ;) But I wanted to ask anyway.

@AnderBiguri
Member

@akeeler Can you test the current state of the branch? In theory I only added input parsing and changed how the data is stored.

Otherwise, maybe you can send me (privately) a test dataset to validate?

@akeeler
Contributor Author

akeeler commented Mar 1, 2024

@AnderBiguri Thanks for all your work on this! I noticed and fixed a couple bugs with the input parser, but it runs cleanly now, and the error handling works well too.

I also reverted the interpolation model to being 'nearest' by default.

Per your earlier question, sinogram interpolation is a fairly nontrivial problem, and linear interpolation is a very non-ideal way of solving it. Linear interpolation essentially averages the adjoining projections, so the interpolated projection won't generally look much like the true intermediate projection, except possibly for very small angular separations. I felt it important to include it as an option in the toolkit since other works use it, but I think the 'nearest' scheme is much more reliable for sinogram interpolation, so I'd prefer that it be left as the default.
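A minimal sketch of the two schemes (hypothetical Python, not the actual proj_interp.m implementation; assumes the measured angles are sorted and the target angles lie within their range):

```python
import numpy as np

def proj_interp(projs, angles, target_angles, method="nearest"):
    """Match a projection stack to a new set of angles.

    'nearest' picks the closest measured projection; 'linear' blends the
    two neighbouring projections, which only approximates the true
    intermediate view for small angular gaps.
    """
    projs = np.asarray(projs, dtype=float)
    angles = np.asarray(angles, dtype=float)
    out = np.empty((len(target_angles),) + projs.shape[1:])
    for k, a in enumerate(target_angles):
        if method == "nearest":
            out[k] = projs[np.argmin(np.abs(angles - a))]
        else:  # linear blend of the two adjoining projections
            i = np.clip(np.searchsorted(angles, a), 1, len(angles) - 1)
            a0, a1 = angles[i - 1], angles[i]
            w = (a - a0) / (a1 - a0)
            out[k] = (1 - w) * projs[i - 1] + w * projs[i]
    return out
```

The weighted average in the 'linear' branch makes the averaging behaviour explicit: for a target angle halfway between two views, the result is exactly their mean, not the true intermediate projection.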

Otherwise, I'm very happy with the code now. Thanks again!

@AnderBiguri
Member

Fantastic! I will merge it then :) Thanks for the work!

@AnderBiguri AnderBiguri merged commit f6b03dc into CERN:master Mar 1, 2024