These two behaviors shouldn't differ, though it's not obvious which should be preferred. Due to current implementation details, in value-only mode the Decimal is always normalized before the least significant decimal place is determined, while in value/uncertainty mode normalization does not necessarily happen.
My inclination is that normalization should happen at the initial input step and again at the formatting step, to guarantee that equal inputs produce equal outputs. The argument against this is that one purpose of the Decimal encoding is to preserve digits (such as trailing zeros) that carry precision information even though they are not, strictly speaking, significant digits.
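To illustrate why normalization matters here (a minimal sketch using only the standard-library `decimal` module, independent of sciform internals): two numerically equal `Decimal` inputs can report different exponents, and hence different least significant decimal places, until one of them is normalized.

```python
from decimal import Decimal

# Two encodings of the same numeric value. Decimal preserves trailing
# zeros, so the stored exponent differs between them.
a = Decimal("120.0")
b = a.normalize()  # strips the trailing zeros -> Decimal("1.2E+2")

# The exponent encodes the least significant decimal place.
print(a.as_tuple().exponent)  # -1: bottom digit is the tenths place
print(b.as_tuple().exponent)  # 1:  bottom digit is the tens place

# Yet the two values compare equal, so any bottom-digit logic that
# reads the exponent directly gives unequal outputs for equal inputs.
print(a == b)  # True
```

This is exactly the hazard described above: if one code path normalizes before reading the exponent and another does not, equal inputs can format differently.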
jagerber48 changed the title from "value/uncertainty AutoDigits does not normalize decimal input before finding bottom digit but value AutoDigits does" to "Bug: value/uncertainty AutoDigits does not normalize decimal input before finding bottom digit but value AutoDigits does" on Feb 8, 2024.