Thanks @cookpa. Looks fine to me.
The MRF evaluation relies on several neighborhood quantities, such as the distance between neighboring voxels, that can be computed more efficiently.
Instead of doing a nested for loop:

```
for each label in classes:
    for each voxel in the neighborhood:
        ...
```

a single loop over all voxels in the neighborhood can sum up the distance weightings of the voxels sharing each class, and those per-class sums can then be multiplied by the appropriate deltas.
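To make the refactor concrete, here is a minimal sketch (not the actual ANTs code; all names, the delta matrix, and the energy form are hypothetical simplifications) showing the nested loop and the single-pass version that accumulates a distance weight per neighbor class and then applies the deltas once:

```python
import numpy as np

def mrf_energy_nested(neighbor_labels, distances, deltas, num_labels):
    # Original pattern: for every candidate label, loop over every
    # neighborhood voxel and multiply its distance weight by the delta.
    energy = np.zeros(num_labels)
    for k in range(num_labels):
        for lbl, d in zip(neighbor_labels, distances):
            energy[k] += d * deltas[k, lbl]
    return energy

def mrf_energy_factored(neighbor_labels, distances, deltas, num_labels):
    # Refactored pattern: one pass over the neighborhood accumulates the
    # total distance weight for each class actually present, then a single
    # matrix-vector product applies the deltas for all candidate labels.
    weight_per_label = np.zeros(num_labels)
    for lbl, d in zip(neighbor_labels, distances):
        weight_per_label[lbl] += d
    return deltas @ weight_per_label
```

Mathematically the two are identical; the factored version replaces O(labels x neighborhood) multiplications with one pass over the neighborhood plus a small per-label product, which is why the savings grow with the neighborhood size.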
This prevents computation time from blowing up as much when the MRF neighborhood radius is greater than 1. Larger neighborhoods still take longer, but this is about 50% faster with an MRF radius of 2x2x2.
This is backward compatible with the existing implementation under the default ANTs cortical thickness parameters.
I did notice some differences on edge voxels when using partial volume classes. GPT suggests this is because ties are more likely under the delta formulation: -2 (same label), -1 (shared partial volume label), +1 (other labels). Combinations of -2, -1, and +1 deltas can nearly cancel, so the result can depend on tiny floating-point differences introduced by moving the multiplication out of the nested loop.
This seems plausible to me. Without partial volume classes the results are identical, so I believe the math is correct.
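The near-tie sensitivity described above comes down to a general property of floating-point arithmetic, illustrated here with a standalone example (not the actual segmentation code): regrouping the same sum can change the result by about one ulp, which is enough to flip a near-tied label comparison.

```python
# Floating-point addition is not associative: accumulating the same
# neighborhood contributions in a different grouping (multiply-then-sum
# vs. sum-then-multiply) can shift the total by ~1 ulp.
grouped_left = (0.1 + 0.2) + 0.3   # 0.6000000000000001
grouped_right = 0.1 + (0.2 + 0.3)  # 0.6
print(grouped_left == grouped_right)  # False
```

When the competing label energies differ by more than this rounding noise, as in the non-partial-volume case, both loop orderings pick the same label, which is consistent with the identical results reported there.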