
fast laplace via initialization #88

Merged
akhanf merged 5 commits into master from fast_laplace_init on Jul 14, 2021
Conversation

jordandekraker
Collaborator

I recycled some old MATLAB code to get a fast (and good quality) initialization for the Laplace solver. The only dependency is scikit-fmm, which is pretty light.
This could be used in conjunction with image pyramids as in #85, but honestly I think this is fast enough now.

Tested and working on my end!
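
For reference, a minimal sketch of this kind of fast-marching initialization, assuming a boolean `domain` mask plus boolean `source` and `sink` boundary masks (the names and the normalization here are illustrative, not the PR's exact code):

```python
import numpy as np
import skfmm

def fastmarch_init(domain, source, sink):
    """Forward/backward fast-marching solutions, each rescaled to [0, 1]."""
    # Zero level set sits on the source boundary; the mask keeps the march
    # inside the domain (masked cells act as barriers in scikit-fmm).
    phi = np.ma.MaskedArray(np.where(source, -1.0, 1.0), ~domain)
    forward = skfmm.distance(phi)            # geodesic distance from the source

    # Same thing, starting from the sink.
    phi = np.ma.MaskedArray(np.where(sink, -1.0, 1.0), ~domain)
    backward = skfmm.distance(phi)           # geodesic distance from the sink

    # Rescale to [0, 1] and orient both so they run 0 (source) -> 1 (sink).
    forward = np.clip(forward / forward.max(), 0.0, 1.0)
    backward = 1.0 - np.clip(backward / backward.max(), 0.0, 1.0)
    return forward.filled(0.0), backward.filled(0.0)
```

Either march (or a combination of the two, as discussed below) can then be passed to the iterative Laplace solver as its starting field in place of a constant 0.5.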

@jordandekraker jordandekraker requested a review from akhanf July 10, 2021 16:38
Jordan DeKraker - B. Bernhardt Lab and others added 3 commits July 10, 2021 12:40
the backward fast-march solution is weighted less heavily near 1 (its maximum) by squaring, and then it is flipped (1 - backward)
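
One reading of this commit, assuming `backward` is the backward fast-march solution rescaled to [0, 1] and still increasing away from the sink:

```python
# Squaring down-weights the backward march near its maximum of 1, and
# 1 - (...) flips it to run in the same direction as the forward march,
# ready to be combined with the forward solution.
backward_flipped = 1.0 - backward ** 2
```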
Member

@akhanf akhanf left a comment

I don't know if it's a great idea to make this combined initialization the new default.
Just averaging them together isn't a very principled combination. I think it would be more sensible to have an option to select the initialization type.

@jordandekraker
Collaborator Author

We could also just keep the forward or the backward march. The only problem is that the end point from one direction doesn't match the start point from the other, so I wanted the combination to be weighted more heavily towards the start of each march. If that's too complicated, then just one of the two marches would still be a good initialization.

@jordandekraker
Collaborator Author

Now that I think about it, if I want to weight their combination, then maybe I should flip them both, square them, flip one back, average them, and then take the square root of the whole thing.
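
One literal reading of that proposal (a sketch only, assuming `forward` runs from 0 at the source to 1 at the sink and `backward` runs from 0 at the sink to 1 at the source):

```python
import numpy as np

# Flip both marches, square them, flip one back, average the two,
# then take the square root of the whole thing.
combined = np.sqrt(((1.0 - (1.0 - forward) ** 2) + (1.0 - backward) ** 2) / 2.0)
```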

Whichever combination method we use (just one direction, a plain average, or a weighted average), it's still a huge improvement over just initializing with a constant value of 0.5, so the choice of combination method is a pretty minor detail in comparison.

@akhanf
Member

akhanf commented Jul 12, 2021 via email

@jordandekraker
Collaborator Author

Nope, this is only used when there is no shape injection (and therefore no injected initialization).

@jordandekraker
Collaborator Author

Did some quick testing, and just averaging the forward and backward marches seems to consistently do best (weighted averaging was getting very complicated). Can probably merge anytime.
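
A sketch of what that amounts to, assuming both marches are rescaled to [0, 1] and oriented the same way (0 at the source, 1 at the sink):

```python
# Plain average of the forward and flipped backward fast-march solutions,
# used in place of a constant 0.5 to initialize the Laplace solver.
init = (forward + backward) / 2.0
```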

@akhanf akhanf merged commit 8ac2edf into master Jul 14, 2021
@akhanf akhanf deleted the fast_laplace_init branch July 14, 2021 20:16