
Switch to PRIMA.bobyqa for optimization? #742

Open
dmbates opened this issue Feb 7, 2024 · 1 comment

@dmbates (Collaborator) commented Feb 7, 2024

See the discussion in lme4/lme4#744 regarding PRIMA.bobyqa in linear mixed models. It may be even more useful to switch to PRIMA.bobyqa for GLMM optimization, because that version of bobyqa allows a scale argument, which can be very helpful when simultaneously optimizing the fixed-effects and variance-component parameters.
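For illustration only, here is a minimal sketch of what passing a scaling vector to PRIMA.bobyqa could look like. The objective, starting point, and scale values are made up (they stand in for parameters on very different scales, not for anything in MixedModels.jl), and the `scale` keyword name follows PRIMA.jl's interface but should be verified against the installed version:

```julia
using PRIMA  # assumes the PRIMA.jl package is available

# Toy objective with parameters on very different scales, e.g. a
# fixed-effects coefficient near 3 and a variance component near 0.01.
# Placeholder for illustration, not MixedModels.jl internals.
f(x) = (x[1] - 3.0)^2 + 1.0e4 * (x[2] - 0.01)^2

x0 = [0.0, 1.0]

# The `scale` keyword supplies per-variable scaling factors so that
# bobyqa works on comparably scaled variables internally; check the
# exact keyword name and signature against the installed PRIMA.jl.
x, info = PRIMA.bobyqa(f, x0; scale = [1.0, 0.01])
```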

If we make this change, should we retain the ability to use the NLopt optimizers, or allow only the PRIMA optimizers? It would be good to retain NLopt, and hence compatibility with earlier versions, but it could get messy with the optsum structure and with tests, not to mention the potential for some really ugly and confusing code.

@zaikunzhang commented

Hi @dmbates ,

I am very glad that PRIMA has turned out to be useful to you and your community. The Julia interface was entirely due to the efforts of @emmt and @amontoison. Ping us if you need our help.

Cheers,
Zaikun
