
feat: Propagate model parameter names to optimizers #1536

Merged: 8 commits, Jul 29, 2021
Conversation

@kratsg (Contributor) commented on Jul 27, 2021

Pull Request Description

Resolves #1099. This passes the model's parameter names through to the optimizers, both SciPy and Minuit. It does so by adding ModelConfig.par_names(), which returns the parameter names in a format that has yet to be agreed upon (for indexing n-binned parameters). That output is threaded through the optimizer framework via the Optimizer._get_minimizer() function call. SciPy makes no use of the names, so nothing changes there beyond supporting the updated function signature; Minuit, however, can take advantage of them.

A format string is allowed as an optional keyword argument so we're not locked into a specific format.
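The described behavior of par_names() can be sketched as a standalone function (this is a hypothetical helper written for illustration, not pyhf's actual implementation; the `param_shapes` argument and its `(name, n_dims)` layout are assumptions):

```python
def par_names(param_shapes, fstring="{name}[{index}]"):
    """Flatten model parameters into one name per fit parameter.

    param_shapes: list of (name, n_dims) pairs in model order, where
    n_dims is the number of bins the parameter spans.
    """
    names = []
    for name, n_dims in param_shapes:
        if n_dims == 1:
            # scalar parameters keep their bare name
            names.append(name)
        else:
            # n-binned parameters get one indexed name per bin,
            # rendered with the user-supplied format string
            names.extend(
                fstring.format(name=name, index=i) for i in range(n_dims)
            )
    return names

print(par_names([("mu", 1), ("uncorr_bkguncert", 2)]))
# → ['mu', 'uncorr_bkguncert[0]', 'uncorr_bkguncert[1]']
```

Passing a different fstring (e.g. "{name}_{index}") changes only the rendering of the indexed names, which is what keeps the naming scheme open for later discussion.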

Checklist Before Requesting Reviewer

  • Tests are passing
  • "WIP" removed from the title of the pull request
  • Selected an Assignee for the PR to be responsible for the log summary

Before Merging

For the PR Assignees:

  • Summarize commit messages into a comprehensive review of the PR:
    - Propagate model parameter names through to the optimizers (primarily Minuit)
      - Additionally allow a customizable format for the parameter names
    - Add par_names method to _ModelConfig
    - Add tests for _ModelConfig.par_names
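The shared-signature change summarized above can be sketched with mock optimizer classes (the class names, return values, and exact signature below are illustrative assumptions; only Optimizer._get_minimizer() is named in the PR, and its real internals are not shown here):

```python
class ScipyLikeOptimizer:
    def _get_minimizer(self, objective, init_pars, par_names=None):
        # SciPy has no concept of named fit parameters, so the names
        # are accepted but ignored, keeping the shared signature intact
        return {"fun": objective, "x0": init_pars}


class MinuitLikeOptimizer:
    def _get_minimizer(self, objective, init_pars, par_names=None):
        # Minuit can label its parameters (iminuit exposes a `name`
        # keyword), which makes fit summaries far more readable
        return {"fun": objective, "x0": init_pars, "name": par_names}


minimizer = MinuitLikeOptimizer()._get_minimizer(
    sum, [1.0, 0.5], par_names=["mu", "uncorr_bkguncert[0]"]
)
print(minimizer["name"])  # → ['mu', 'uncorr_bkguncert[0]']
```

The point of the design is that the mixin layer can pass par_names unconditionally; each backend decides whether to use them.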

@kratsg kratsg added feat/enhancement New feature or request optimization labels Jul 27, 2021
@kratsg kratsg self-assigned this Jul 27, 2021
@kratsg (Contributor, Author) commented on Jul 27, 2021

Currently, this produces [mu, uncorr_bkguncert[0], uncorr_bkguncert[1]], but we could/should discuss the various naming schemes to allow. One option is to let the user choose the naming scheme by providing a format string.

This is now customizable via an optional fstring keyword argument.
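As a small illustration of that idea (the two format strings below are hypothetical choices, not an agreed-upon scheme):

```python
# Render bin 0 of a vector parameter under two candidate naming
# schemes, driven entirely by the user-supplied format string.
for fstring in ("{name}[{index}]", "{name}_{index}"):
    print(fstring.format(name="uncorr_bkguncert", index=0))
# → uncorr_bkguncert[0]
# → uncorr_bkguncert_0
```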

@codecov bot commented on Jul 29, 2021

Codecov Report

Merging #1536 (01f9beb) into master (a0e2568) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master    #1536   +/-   ##
=======================================
  Coverage   97.65%   97.66%           
=======================================
  Files          63       63           
  Lines        4006     4023   +17     
  Branches      565      571    +6     
=======================================
+ Hits         3912     3929   +17     
  Misses         55       55           
  Partials       39       39           
Flag       Coverage Δ
contrib    25.42% <5.26%> (-0.09%) ⬇️
unittests  97.43% <100.00%> (+0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                   Coverage Δ
src/pyhf/optimize/opt_scipy.py   100.00% <ø> (ø)
src/pyhf/optimize/mixins.py      100.00% <100.00%> (ø)
src/pyhf/optimize/opt_minuit.py  100.00% <100.00%> (ø)
src/pyhf/pdf.py                  96.54% <100.00%> (+0.10%) ⬆️

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@matthewfeickert (Member) left a comment

Minor comments (I'll check the formatting now and merge if it looks okay), but this all looks good.

Review thread on src/pyhf/pdf.py (outdated, resolved)
@matthewfeickert (Member) left a comment

Yeah, as that checks out in the docs build, this all LGTM. Thanks @kratsg!

Labels
feat/enhancement (New feature or request), optimization, tests, pytest
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Propagating model parameter names to MINUIT
2 participants