Handling of constraints for polydisperse parameters (fix #1588) #2348
Conversation
Now, editing the constraint on a poly param is possible.
Do a better check for polydisp. parameters.
constraint getter.
Created a dictionary within the FittingWidget class, named model_dict, to hold the main models of SasView. The keys are "standard", "poly" and "magnet", referring to the main model, the polydispersity model and the magnetism model respectively. Many functions in the code have been refactored to accept an argument, model_key, which corresponds to a key in model_dict. This allows the relevant model to be passed to a function without having to pass the bumps object, which could lead to the code misbehaving. (A sketch of this pattern follows the commit list below.)
…training or removing a constraint on right click.
Required updating the getSymbolDict function in line with previous commits on this branch
Small edit to the getActiveConstraintList function
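For context, a minimal sketch of the model_dict pattern described in the commit message above; the class layout and helper names here are hypothetical, not the exact SasView code.

```python
# Hypothetical sketch of the model_dict / model_key pattern (names illustrative).
class FittingWidget:
    def __init__(self, standard_model, poly_model, magnet_model):
        # Map a string key to each underlying item model, so callers pass
        # a key instead of the model (or bumps) object itself.
        self.model_dict = {
            "standard": standard_model,   # main model parameters
            "poly": poly_model,           # polydispersity parameters
            "magnet": magnet_model,       # magnetism parameters
        }

    def getConstraintForRow(self, row, model_key="standard"):
        """Look up the constraint for a row in the model chosen by model_key."""
        model = self.model_dict[model_key]
        return model.constraint_for_row(row)   # hypothetical helper
```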
The last commit affects only the interface and should allow closing issue #2372. With this commit all parameters (whether adjustable, fixed or constrained) will appear on the RHS of the Complex Constraint interface and can be used in the definition of a new constraint. It also fixes the FittingWidget.paramHasConstraint method, which was working for polydisperse parameters but not for the standard ones. With the changes in this commit, this method is no longer useful, but I decided to keep it, as well as the commented-out code in ComplexConstraint.setupParamWidgets, in case we adopt a different logic in the future and decide that constrained parameters should not be used in the definition of a new constraint.
src/sas/sascalc/fit/BumpsFitting.py (outdated)

```python
    stderr.append(p.value.s)
# p constrained based on another parameter
# Warning: This will not work for parameters that are tied
# to other parameter already constrained.
# Is this still needed with the added _allComputedParamsUncertaintiesDefined test?
else:
```
Do you ever get into the `else` condition?
I think you can replace the entire if-else block with:
```python
assert isinstance(p.value, (uncertainties.core.Variable, uncertainties.core.AffineScalarFunc))
# value.n is the nominal value of p
pvec.append(p.value.n)
# value.s is the error (standard deviation) of p
stderr.append(p.value.s)
```
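For context, a small self-contained illustration (not SasView code) of the uncertainties objects the assert checks for: fitted values come back as Variable/AffineScalarFunc instances whose .n and .s attributes carry the nominal value and its standard deviation.

```python
# Standalone demo of the uncertainties objects referenced above (not SasView code).
import uncertainties
from uncertainties import ufloat

x = ufloat(10.0, 0.5)   # a Variable: nominal value 10.0, standard deviation 0.5
y = 2 * x + 1           # arithmetic on Variables yields an AffineScalarFunc

assert isinstance(x, uncertainties.core.Variable)
assert isinstance(y, uncertainties.core.AffineScalarFunc)
print(x.n, x.s)         # 10.0 0.5
print(y.n, y.s)         # 21.0 1.0 (uncertainty propagated through 2*x + 1)
```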
Now that everything seems fixed, the execution flow no longer enters the else condition in any of my tests, so I removed the else block entirely. As a precaution, I kept a try/except, so that if for some strange reason a parameter is still not an uncertainty object, the user will at least get the parameter value (plus an error message). But if this is judged completely unnecessary, I have no objection to removing the try/except.
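A minimal sketch of the kind of defensive fallback described above, assuming the same p, pvec and stderr as in the reviewed snippet and that numpy (as np) and logging are already imported; the exact message is illustrative.

```python
# Hypothetical fallback: if p.value is not an uncertainties object,
# keep the plain value and record a missing error estimate.
try:
    pvec.append(p.value.n)    # nominal value
    stderr.append(p.value.s)  # standard deviation
except AttributeError:
    logging.warning("No uncertainty available for parameter %s; using raw value", p.name)
    pvec.append(p.value)
    stderr.append(np.nan)
```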
My inclination would be to drop it, but I don't object to leaving it in.
In principle one should strive for 100% code coverage with tests. The except block is untestable (you can't trigger the except without extraordinary measures). Let the higher-up exception handler catch future broken code.
Let's do that then. I just removed a few more lines :-)
I've only reviewed the code in sascalc, not the GUI, and I haven't tried running it, so I'll leave it to others to approve the PR.
It works as expected if the weights are not modified. When I changed the weight to 1.5 for M1, and then changed them all back to 1.0, the weights for M2 and M3 were not reset (see attached image, parameters and log). I think it is an unrelated issue and therefore I would vote for merging this one and addressing weighting separately.
The weighting is correct (at least, the fact that the values shown in the log are not the same as those given in the interface is the expected behaviour, not a bug). I realize that this is confusing, but the idea is that the weights shown in the log are the effective weights of each data set that are passed to bumps, so that each data set contributes to the fit with a "force" proportional to the weights given by the user in the interface (sorry, not sure that this is very clear either!). When you give your 3 data sets a weight of 1, the effective weights shown in the log are automatically calculated to take into account the different number of points and error bars in each set, such that the 3 sets contribute more or less equally to the fit. Clear documentation explaining how the weighting works is needed, but that is a separate issue.
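Purely as a toy illustration of this idea, and not SasView's actual algorithm, effective weights of this kind could be obtained by rescaling the user weights by each data set's size and typical error bar, for example:

```python
import numpy as np

def toy_effective_weights(user_weights, datasets):
    """Toy rescaling only -- not SasView's real formula.

    Illustrates why the weights reported in the log can differ from the
    ones typed in the interface: sets with many points or small error
    bars would otherwise dominate the combined chi-squared.
    """
    raw = [w / (len(d["y"]) * np.mean(d["dy"]) ** 2)
           for w, d in zip(user_weights, datasets)]
    scale = len(raw) / sum(raw)          # normalise so weights average to ~1
    return [r * scale for r in raw]
```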
What is a bit puzzling with the weighting is that one cannot get back to the default values, i.e. if all values are set to 1.0 in the parameter table, the info in the log shows:
Once the values are modified, e.g. the M1 weight is set to 1.5 and you run a fit, the values change (that's expected), but when you change them all back to 1.0 the "INFO" weights won't come back to one. I agree this is beyond the scope of this PR and may not be desired behaviour.
I found one potential issue while testing the latest code. This may, however, be a more general issue with constrained fitting. When you set up the constraint M2.radius = M1.radius and then deselect radius in the M1 model panel, the fit still seems to start, but it returns an error.
Concerning the weights, the way to recover the default values is to deactivate the 'Modify weighting' option. But I agree that the difference between the values in the log info and the user interface is puzzling, so this is something to discuss and probably improve in the future.
Well spotted: the error occurs when a parameter is tied to a parameter that is not a free parameter. The root cause of the error is that the fixed parameter is not an uncertainty object, so the tied parameter will not be one either, and after the last changes discussed with Paul K this results in an assertion error. The question is whether such a constraint definition makes sense and whether we should allow it or not. Note that fixed parameters are still allowed and correctly handled when used as part of a formula. So the quick fix here is to recover the try/except with a warning to the user that the uncertainty could not be computed for a given parameter, so that at least SasView does not stop. Otherwise, we need to address the root problem as suggested by Paul K, meaning that we need to implement a method to identify such problematic definitions and then decide what to do; I can see several possibilities.
I can have a look at that, but I'm not sure that I can find a useful solution in a reasonable time, so I would propose going for the quick fix in this PR and then discussing in one of our calls whether something better is needed.
The constraints should be independent of fitting. That is, I want to be able to set the two radii to be equal (so tied by an expression) even if neither is selected for fitting. The GUI should allow this without having to toggle fitting on to set up the constraint and off again to suppress fitting. You can easily replace the assert with a condition, setting stderr to either NaN or zero for propagation of unconstrained values. I suggest replacing the entire block with:

```python
# Note: Derived values will not have uncertainty if the inputs are fixed, so no 'n' or 's'.
fitting_result.pvec = np.array([getattr(p.value, 'n', p.value) for p in pars])
fitting_result.stderr = np.array([getattr(p.value, 's', 0) for p in pars])
DOF = max(1, fitness.numpoints() - len(fitness.fitted_pars))
fitting_result.fitness = np.sum(fitting_result.residuals ** 2) / DOF
```

This produces the χ² associated with the current set of parameter values, and the fitted parameters' uncertainty estimates, even if the fit quits early. Leave it to the GUI to decide how to inform the user on the basis of fitting_result.success. For now you can reproduce the current behaviour by adding:

```python
# TODO: Let the GUI decide how to handle success/failure.
if not fitting_result.success:
    fitting_result.stderr[:] = np.NaN
    fitting_result.fitness = np.NaN
```

There are other code simplifications that could be done but this is enough for now.
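A hedged sketch of what leaving this to the GUI could look like, assuming a fitting_result carrying the fields set above; the reporting calls are illustrative, not existing SasView code.

```python
# Hypothetical GUI-side handling of the result populated above.
def report_fit_result(fitting_result, log):
    if fitting_result.success:
        log.info("Fit converged: chisq/DOF = %g", fitting_result.fitness)
    else:
        # Parameter values and uncertainty estimates are still filled in
        # even when the fit quits early, so report them with a warning
        # instead of blanking them out.
        log.warning("Fit did not converge; showing last parameter values "
                    "(chisq/DOF = %g)", fitting_result.fitness)
```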
The problem is that we may have some ambiguous situations where the user's intention is not completely clear. Some examples, and the current behaviour of SasView in each case, are:
In any case, the current behaviour does not seem unreasonable to me, so I would vote to keep it as it is. The errors that we have observed do not appear during the fitting, but in the post-processing of the uncertainties. I will try the new suggestion from Paul K. Otherwise, I will go back to a simple try/except block that takes different actions depending on whether M1 is just a number (fixed parameter) or an uncertainty object (fitted parameter). This also works, and it is not very clear to me which information we would be throwing away.
I applied the suggested changes and tested; it works fine. The only minor drawback I found is that now the only sign that something "strange" could be happening with the constraint definitions is that the error of the tied parameter will appear as 0 in the corresponding FitPage. We can discuss whether it is necessary to check for this and send an explicit warning to the Log Explorer asking the user to verify the fit conditions.
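A minimal sketch of the kind of post-fit check discussed here, with hypothetical names (param_names, constrained_names); it simply flags constrained parameters whose reported uncertainty is exactly zero.

```python
import logging

def warn_on_zero_uncertainty(param_names, stderr, constrained_names):
    """Hypothetical post-fit check: flag tied parameters with zero error."""
    for name, err in zip(param_names, stderr):
        if name in constrained_names and err == 0:
            logging.warning(
                "Parameter %s is constrained but its uncertainty is 0; "
                "please verify the fit and constraint setup.", name)
```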
The existing constraint system is purely assignment rather than a general mathematical constraint. Given that constraints are only being used to tie parameters between models, and almost(?) never to calculate arbitrary expressions, you could scrap the existing GUI and replace it with something easier to control. This could build up the set of expressions needed to apply the constraints. That is outside the scope of this PR and better left as a discussion.
You are probably trying to query all constraints based on the model of the first constraint rather than on both models?
@rozyczko Yes, this is something that I already noted at the code camp, but have not tried to fix yet. I described the problem in issue 2355. All the work and testing in this PR was done using only the Const. & Simul. Fit tab. It's in my plans to try to solve that issue too, but I don't know when I'll have time. In the meantime, I would argue that this PR fixes the main problem of being able to constrain polydisperse parameters as well, and I expect most users will use this functionality from the simultaneous tab, so issue 2355 is a minor problem and should not delay merging this PR. To discuss in the next call?
Ah! OK then, as long as we have this listed as an issue, I agree it's better to have the functionality in before the codebase diverges and we have to spend time solving merge conflicts again, considering how big this PR is.
As discussed at the biweekly call, we are merging this PR and saving related issues/discussion for the future.
Description
Changes to correctly handle constraints for polydisperse (PD) parameters.
Now PD parameters appear in the combo box menu of the Complex Constraint window and behave like the main parameters (constraints can be added, selected/deselected, edited, etc.).
Fixes #1588
How Has This Been Tested?
Tested on the 3 AOT data sets of the simultaneous 1D fitting tutorial, combining standard and PD constraints and trying the available options (check/uncheck, remove constraint, using a formula) and everything worked as expected.
Known issue
A pending problem is that the PD constraints are not loaded with the project, so they have to be reintroduced every time one loads a project. This is not unexpected, as there are comments (TODO: add polidyspersity and magnetism) in FittingPage.readFitPage and FittingPage.saveToFitPage. The PD constraints are nevertheless written into the project json file, but they are not loaded back. I suggest merging this PR and opening a separate issue for this.