[r] Add feature selection methods by variance, dispersion, and mean accessibility #169
Details
Create functions to do feature selection, as a foundation for LSI and iterative LSI. These take the number of features to select and, for the methods that use variance or dispersion, an optional normalization function. The end result is a tibble with columns names, score, and highly_variable.
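For context, here is a minimal sketch of what one of these functions might look like for the variance-based method, assuming a dgCMatrix input. The name select_features_by_variance and its arguments are placeholders, not the actual implementation:

```r
library(Matrix)
library(tibble)

# Hypothetical sketch: rank features (rows) by variance and flag the top n_features.
select_features_by_variance <- function(mat, n_features = 2000, normalize_fn = NULL) {
  if (!is.null(normalize_fn)) {
    mat <- normalize_fn(mat)
  }
  row_means <- Matrix::rowMeans(mat)
  sq <- mat
  sq@x <- sq@x^2                                  # square nonzero entries without densifying
  row_vars <- Matrix::rowMeans(sq) - row_means^2  # uncorrected variance, sufficient for ranking
  tibble::tibble(
    names = rownames(mat),
    score = row_vars,
    highly_variable = rank(-row_vars, ties.method = "first") <= n_features
  )
}
```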
Tests
Since the interfaces are very similar, I just put all of the methods in a loop and test that the tibbles are formed as we expect. I'm not sure it makes sense to test the actual feature selection logic, since that would just be re-doing the same operations on a dgCMatrix. Do you have test ideas with better signal on whether these methods perform as we expect?
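Roughly, the looped test looks like the sketch below; it assumes testthat and uses hypothetical selector names (select_features_by_variance, select_features_by_dispersion, select_features_by_mean):

```r
library(testthat)
library(Matrix)

test_that("feature selection returns a well-formed tibble", {
  mat <- rsparsematrix(nrow = 100, ncol = 50, density = 0.1)
  rownames(mat) <- paste0("feature", seq_len(nrow(mat)))

  selectors <- list(
    variance   = select_features_by_variance,
    dispersion = select_features_by_dispersion,
    mean       = select_features_by_mean
  )

  for (name in names(selectors)) {
    result <- selectors[[name]](mat, n_features = 20)
    expect_s3_class(result, "tbl_df")
    expect_named(result, c("names", "score", "highly_variable"))
    expect_equal(nrow(result), nrow(mat))
    expect_equal(sum(result$highly_variable), 20)
  }
})
```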
Notes
This currently merges into the normalization branch, but only so the normalization logic is available within feature selection. I think it would make sense to merge normalization into main once it is approved, then retarget this PR to main.
I think the underlying logic is essentially the same between the feature selection methods, so I am leaning closer and closer to putting all of the logic into a single function with an enum-like parameter that selects the method (see the sketch below). However, this might clash with LSI/iterative LSI unless we are okay with putting a purrr::partial() call directly in the default args. That is, until we develop the option to do implicit partials.