
Speed up GLMM mode #257

Open
jr-leary7 opened this issue Oct 18, 2024 · 2 comments
Assignees: jr-leary7
Labels: enhancement (New feature or request), GLMM (related to the GLMM model backend)

Comments

@jr-leary7 (Owner) commented:

  • remove fitting of the subject-specific GLM models within fitGLMM()
    • this requires some re-working of various odds and ends, but should be feasible
  • see if it's possible to speed up the NB LASSO step; per profiling with profvis, it is the largest runtime component
    • maybe supply a smaller / more intelligent set of values of the penalty parameter $\lambda$? right now 50 values are considered, but the values themselves are chosen automatically
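One way to shrink that search, assuming the NB LASSO backend accepts a user-supplied sequence of penalty values (as mpath- / glmnet-style fitters generally do; the exact fitter and argument names below are illustrative, not confirmed in this thread), is to pass a shorter, log-spaced candidate sequence instead of letting the fitter generate 50 values:

```r
# Sketch: a shorter, log-spaced candidate grid for the penalty parameter lambda.
# Assumes the NB LASSO fitter exposes a `lambda` argument for user-supplied
# sequences -- the fitter call and data frame `pt_df` here are hypothetical.
lambda_seq <- exp(seq(log(1), log(1e-4), length.out = 20))  # 20 values instead of 50

nb_fit <- mpath::glmregNB(gene ~ ., data = pt_df, lambda = lambda_seq)
```

Log-spacing concentrates candidates where the solution path changes fastest (small $\lambda$), so fewer values are usually needed than with an automatically chosen grid of fixed length.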
@jr-leary7 jr-leary7 added enhancement New feature or request GLMM related to the GLMM model backend labels Oct 18, 2024
@jr-leary7 jr-leary7 self-assigned this Oct 18, 2024
@jr-leary7 (Owner, Author) commented:

Also figure out whether the MNET penalty actually induces sparsity or not ...

@jr-leary7 (Owner, Author) commented:

Dumb question -- the MNET penalty does induce sparsity in the coefficients, à la the LASSO. Maybe add an argument to fitGLMM() specifying whether the LASSO, MNET, or SNET penalty should be used? Do an analysis to determine which yields the sparsest fits, as that's likely what we want.
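The sparsity comparison above could be sketched roughly as follows, assuming an mpath-style fitter with a `penalty` argument (the fitter call and data frame `pt_df` are hypothetical; in mpath's naming, "enet" with alpha = 1 corresponds to the LASSO):

```r
# Sketch: compare coefficient sparsity across candidate penalties.
# Illustrative only -- assumes mpath::glmregNB() is the NB backend and that
# coef() returns the coefficient path across the lambda sequence.
penalties <- c("enet", "mnet", "snet")  # enet (alpha = 1) ~ LASSO

sparsity <- sapply(penalties, function(p) {
  fit <- mpath::glmregNB(gene ~ ., data = pt_df, penalty = p)
  mean(coef(fit) == 0)  # proportion of exactly-zero coefficients across the path
})
sparsity  # higher = sparser; the sparsest penalty is likely the one we want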

jr-leary7 added a commit that referenced this issue Oct 19, 2024