Fix ±0 is displayed for results within a range without uncertainty set #2426
Description of the issue/feature this PR addresses
This Pull Request addresses a bug introduced with #2096. Before that change, it was possible to define an uncertainty range without an Uncertainty value set:
This was handy when the user wanted the system not to display the uncertainty for certain ranges (e.g. when the result falls between the detection limit and the quantification limit). With the latest version, the system converts empty values to 0 automatically:
And even though a `0` is meaningless as an uncertainty, it gets displayed in the results view and in the results report. This Pull Request makes the system skip rendering uncertainties when the uncertainty value is not above 0.
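A minimal sketch of the guard described above, assuming a hypothetical `format_uncertainty` helper (the function name, signature and return convention are illustrative assumptions, not the actual senaite/bika code path):

```python
def format_uncertainty(uncertainty):
    """Return a display string for the uncertainty, or None when it
    should not be rendered (hypothetical helper for illustration).

    Rule this PR describes: an uncertainty that is empty, zero or
    negative carries no information, so nothing is rendered for it.
    """
    try:
        value = float(uncertainty)
    except (TypeError, ValueError):
        # Empty or non-numeric uncertainty: nothing to display
        return None
    if value <= 0:
        # A zero (or negative) uncertainty is meaningless: skip rendering
        return None
    return u"\u00b1 {}".format(value)
```

Under this sketch, `format_uncertainty("")` and `format_uncertainty(0)` return `None` (nothing rendered), while `format_uncertainty(0.5)` returns `± 0.5`.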
Current behavior before PR
The system renders uncertainties of 0 or below in the results view and in the results report.
Desired behavior after PR is merged
The system does not render uncertainties of 0 or below in the results view or in the results report.
--
I confirm I have tested this PR thoroughly and coded it according to PEP8
and Plone's Python styleguide standards.