
Remove lambda-dependency from all phenomenological background models #43

Merged
merged 7 commits into main from feature/phenom_bckg_lambda_removal on Oct 23, 2020

Conversation

luisfabib
Collaborator

No description provided.

@luisfabib luisfabib added this to the 0.12.0 milestone Oct 20, 2020
@luisfabib luisfabib linked an issue Oct 20, 2020 that may be closed by this pull request
@stestoll stestoll marked this pull request as draft October 20, 2020 16:34
@luisfabib
Collaborator Author

The lambda dependency has been removed from all phenomenological background models. However, they can still take lambda as a third input; it has no effect on the math, so it is essentially unused in the phenomenological models.

Now we have two alternatives:

  • Keep the lambda input for the phenomenological models even though it has no effect whatsoever. This keeps the code everywhere else cleaner.
  • Remove the lambda input for the phenomenological models. This would require all functions that use these models, e.g. fitsignal, to check against a hard-coded list of model names to recognize which ones take lambda and which do not.

At the moment I'm slightly in favor of the first option, provided that the documentation clearly states that phenomenological models do not depend on lambda. This would solve the issue, albeit admittedly not cleanly, and it would give us time to find a cleaner and more elegant solution while the math is already correct.
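
For illustration, here is a minimal sketch of what the first option means in practice. The model name, parameterization, and expression are placeholders for this discussion, not the actual DeerLab code:

```python
import numpy as np

def bg_strexp(t, param, lam=1):
    """Placeholder phenomenological stretched-exponential background.

    lam is accepted as a third input so that callers such as fitsignal
    can pass it uniformly to every background model, but it deliberately
    does not enter the expression.
    """
    kappa, d = param
    return np.exp(-(kappa * np.abs(t))**d)
```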

@stestoll
Collaborator

stestoll commented Oct 22, 2020

I think the second alternative is better. A function shouldn't take a positional argument if it doesn't use it. But I understand that this creates some other issues. Maybe signature from the inspect module, or something similar, could help.

@luisfabib
Collaborator Author

Did not know about the inspect.signature function. Yeah, that should take care of most potential issues.
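
As a rough sketch of this idea (the model names and expressions below are placeholders, not the actual DeerLab implementations), inspect.signature lets a fitting routine decide at runtime whether a background model declares a lambda argument:

```python
import inspect
import numpy as np

def bg_exp(t, param):
    # Placeholder phenomenological model: no lambda in the signature
    return np.exp(-param * np.abs(t))

def bg_hom3d(t, param, lam):
    # Placeholder physical model: lambda enters the expression
    return np.exp(-lam * param * np.abs(t))

def takes_lambda(model):
    # A model "takes lambda" if its signature declares a third parameter
    return len(inspect.signature(model).parameters) >= 3

def eval_background(model, t, param, lam):
    # Dispatch on the signature instead of on a hard-coded list of model names
    return model(t, param, lam) if takes_lambda(model) else model(t, param)
```

With a check like this, functions such as fitsignal would not need to maintain a hard-coded list of which models are phenomenological.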

@luisfabib luisfabib marked this pull request as ready for review October 23, 2020 11:36
@luisfabib luisfabib requested a review from stestoll October 23, 2020 11:36
@stestoll stestoll merged commit 5ac739c into main Oct 23, 2020
stestoll pushed a commit that referenced this pull request Oct 23, 2020
* docs: major edits, included license, and beginners guide section

* fitsignal: experiment parameters are now properly printed when verbose (bug fix)

* docs: rewritten uncertainty section

* docs: minor edits and spelling corrections

* docs: rewritten Basics section, edits on other pages

* docs: search engine highlighting no longer introduces spaces (bug fix)

* docs: minor edits

* docs: adapted phenomenological background model equations to #43

* docs: minor edits in frontpage

* docs: fixed example in fitparamodel docstring (#47)
@stestoll stestoll deleted the feature/phenom_bckg_lambda_removal branch October 23, 2020 18:45
Successfully merging this pull request may close these issues.

Remove lambda parameter from phenomenological background models