READ THESE INSTRUCTIONS FULLY. IF YOU DO NOT, YOUR ISSUE WILL BE CLOSED.
If you have not submitted a GitHub Issue to lmfit before, read this first.
DO NOT USE GitHub Issues for questions; they are only for bugs in the lmfit code!
Issues here are concerned with errors or problems in the lmfit code. We use it as our bug
tracker. There are other places to get support and help with using lmfit.
If you think something is an Issue, it probably is not an Issue. If the behavior you
want to report involves a fit that runs to completion without raising an exception but
that gives a result that you think is incorrect, that is almost certainly not an Issue.
Use the mailing list or GitHub discussions
page for questions about lmfit or things
you think might be problems. We don't feel obligated to spend our free time helping
people who do not respect our chosen work processes, so if you ignore this advice and post
a question as a GitHub Issue anyway, it is quite likely that your Issue will be closed and
not answered. If you have any doubt at all, do NOT submit an Issue.
To submit an Issue, you MUST provide ALL of the following information. If you delete any
of these sections, your Issue may be closed. If you think one of the sections does not
apply to your Issue, state that explicitly. We will probably disagree with you and insist
that you provide that information. If we have to ask for it twice, we will expect it to be
correct and prompt.
First Time Issue Code
Yes, I read the instructions and I am sure this is a GitHub Issue.
Description
Calling eval_uncertainty() raises an AttributeError, but only on a ModelResult that has been loaded with load_modelresult(). On a ModelResult returned directly by a fit, the method works as expected.
Matt Newville confirmed this bug on the lmfit mailing list (linked below) and suggested this fix:
File "..../lmfit/model.py", line 1598, in eval_uncertainty for comp in self.components:
Should be: for comp in self.model.components:
A Minimal, Complete, and Verifiable example
import numpy as np
from lmfit.models import LinearModel
from lmfit.model import save_modelresult, load_modelresult

x = np.array([1, 2, 3, 4, 5])
y = np.array([1, 2, 3, 4, 5])

model = LinearModel()
params = model.guess(y, x=x)  # starting parameter values
result = model.fit(y, params, x=x)
save_modelresult(result, "save_result.sav")
print(result.eval_uncertainty(x=x))  # THIS WORKS

loaded_result = load_modelresult("save_result.sav")
print(loaded_result.eval_uncertainty(x=x))  # THIS THROWS AttributeError
Fit report:
Error message:
File "..../lmfit/model.py", line 1598, in eval_uncertainty
for comp in self.components:
AttributeError: 'ModelResult' object has no attribute 'components'
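Until a fixed release is available, a possible stop-gap (my own sketch, not an official lmfit API, and untested here) is to copy the component list from the attached Model onto the reloaded ModelResult before calling eval_uncertainty(); this assumes components is the only attribute the loaded object is missing:

loaded_result = load_modelresult("save_result.sav")
# hypothetical workaround: alias the Model's component list onto the result
# so the loop at model.py line 1598 finds the attribute it expects
loaded_result.components = loaded_result.model.components
print(loaded_result.eval_uncertainty(x=x))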
Version information
lmfit: 1.2.2, scipy: 1.11.1, numpy: 1.25.1, asteval: 0.9.31, uncertainties: 3.1.7
Link(s)
https://groups.google.com/g/lmfit-py/c/1Brvz0YamOs/m/u_1g_bubCwAJ?utm_medium=email&utm_source=footer