feat: Propagate model parameter names to optimizers #1536
Conversation
Customizable via an optional keyword argument.
Codecov Report
```diff
@@           Coverage Diff           @@
##           master    #1536   +/-   ##
=======================================
  Coverage   97.65%   97.66%
=======================================
  Files          63       63
  Lines        4006     4023     +17
  Branches      565      571      +6
=======================================
+ Hits         3912     3929     +17
  Misses         55       55
  Partials       39       39
```
Flags with carried forward coverage won't be shown.
Minor comments (which I'll check now on the formatting and merge if it looks okay), but this all looks good.
Yeah, as that checks out in the docs build this all LGTM. Thanks @kratsg!
Pull Request Description
Resolves #1099. This passes the parameter names through to the optimizers, both scipy and minuit. This is done by adding
ModelConfig.par_names()
which returns the parameter names in a format that has yet to be agreed upon (for indexing n-binned parameters). This function's output is threaded through the optimizer framework via the Optimizer._get_minimizer()
function call. However, scipy does not make use of this, so not much changes there except to support the updated function signature, while Minuit is able to take advantage of it. A format string is allowed as an optional keyword argument so we're not locked into a specific format.
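The name expansion described above can be sketched as a small standalone helper. This is illustrative only: the helper name, its signature, and the default format string are assumptions for this sketch, not the merged pyhf API.

```python
def par_names(par_map, fstring="{name}[{index}]"):
    """Expand model parameters into one flat list of names.

    par_map: list of (parameter name, number of components) pairs.
    fstring: format applied to each component of an n-binned parameter.
    """
    names = []
    for name, n_params in par_map:
        if n_params == 1:
            # scalar parameters keep their bare name
            names.append(name)
        else:
            # n-binned parameters get one indexed name per component
            names.extend(
                fstring.format(name=name, index=index)
                for index in range(n_params)
            )
    return names


# e.g. a signal strength plus a 2-bin uncorrelated background uncertainty
print(par_names([("mu", 1), ("uncorr_bkguncrt", 2)]))
# prints ['mu', 'uncorr_bkguncrt[0]', 'uncorr_bkguncrt[1]']
```

A flat list like this can then be handed to a minimizer that accepts parameter names (iminuit's `Minuit` constructor takes a `name` keyword argument, for example), while a scipy-based backend can simply ignore it.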
Checklist Before Requesting Reviewer
Before Merging
For the PR Assignees: