Should BigDecimal always be normalized? #26
I think a bit more legwork needs to be done to justify the statements in http://speleotrove.com/decimal/decifaq1.html#tzeros in the context of JavaScript and of this proposal. For example, the first bullet point seems, as an end-user concern, to be mostly targeted at formatting, and so could be solved in other ways (e.g. at formatting time). Similarly, most of the bullet points following that one really seem to be user-facing formatting concerns, or in one case database-facing serialization concerns. They are not strongly supportive of the notion that decimals should carry precision information with them internally, throughout the program. I think it would be helpful to approach this from the perspective of observable consequences for JavaScript programmers. Those seem to be:
There's also round-tripping to external places where decimals are held in other formats.
Good point. Although from #16 it sounds like the plan is already to have that format be non-interoperable between machines, so the bar already seems pretty low there in terms of what guarantees programmers can rely on. (I suppose you could also be referring to other serialization mechanisms besides the BigDecimalXArrays, but those seem to fall into the "formatting time" bucket I already included.)
Representation of financial amounts could also be addressed with some F#-like units system (the extended numeric literals proposal seems to be a step in that direction), so storing precision within BigDecimal is not the only option.
BigDecimal wouldn’t be storing currency either; precision seems to me to be in the same category: things that you need to preserve explicitly alongside the amount (the BigDecimal).
@domenic The plan in that issue was to leave two options, and have DataView methods to select one or the other. This is parallel to TypedArrays and endianness. Anyway, that's just one serialization format for interacting with certain systems; you could do other things in JS as well.
I'm unclear about the precision semantics for operations on mixed-precision BigDecimals -- would it be right to assume that we take for
@LeszekSwirski Those are the semantics I was picturing for
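To make the mixed-precision question concrete, here is a minimal sketch of one candidate rule: the result of an addition keeps as many fractional digits as the more precise operand, which is also IEEE 754's preferred-exponent rule for decimal addition. The `{ c, e }` representation and the function names are illustrative assumptions, not the proposal's actual data model or API:

```javascript
// Toy unnormalized decimal: value = c * 10^e, with a BigInt coefficient c.
// (Hypothetical model for illustration; not the proposal's data model.)
function dec(c, e) { return { c: BigInt(c), e }; }

// Candidate mixed-precision rule: the sum keeps the smaller exponent,
// i.e. as many fractional digits as the more precise operand.
function add(a, b) {
  const e = Math.min(a.e, b.e);
  const up = (x) => x.c * 10n ** BigInt(x.e - e);
  return { c: up(a) + up(b), e };
}

// Render a non-negative toy decimal, keeping trailing zeros.
function toDecimalString(x) {
  if (x.e >= 0) return (x.c * 10n ** BigInt(x.e)).toString();
  const s = x.c.toString().padStart(-x.e + 1, "0");
  return s.slice(0, x.e) + "." + s.slice(x.e);
}

// 0.5 + 0.50 = 1.00 under this rule: the trailing zero survives.
console.log(toDecimalString(add(dec(5, -1), dec(50, -2)))); // "1.00"
```

Under such a rule, precision propagates through arithmetic -- which is exactly the information a normalized data model would discard.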
Due to the feedback so far, I'd suggest that we switch to a normalized data model, as in #29 . |
Fabrice Bellard said the following about normalization:
I think that the advantages and disadvantages of normalized and unnormalized arithmetic discussed in this part of the FAQ (the first two questions) should be considered. These two questions address the merits of normalized vs. unnormalized decimal arithmetic more directly than the issue of trailing zeros does. They were apparently written by Mike Cowlishaw, but they are quite dated -- does he still think these advantages and disadvantages hold? Also, in my opinion, normalization of decimal numbers should only affect how those numbers are compared and formatted (in the sense that, say, 0.5 and 0.50 are the same as far as the application is concerned, but that either value is normalized by default to 0.5 at the moment of formatting); it should not preclude implementations from storing and operating on decimal numbers, internally, in either unnormalized or normalized representations.
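That last point can be sketched in a few lines: normalization as a comparison-time (and formatting-time) step, rather than a property of the stored value. The `{ c, e }` coefficient/exponent pair is an assumed toy representation, for illustration only:

```javascript
// Normalization strips trailing zeros from the coefficient, bumping the
// exponent to compensate. (Toy model; not the proposal's API.)
function normalize({ c, e }) {
  while (c !== 0n && c % 10n === 0n) { c /= 10n; e += 1; }
  return { c, e };
}

// Comparison normalizes both sides, so 0.5 and 0.50 are equal even if
// they are stored with different exponents.
function equal(a, b) {
  const na = normalize(a), nb = normalize(b);
  return na.c === nb.c && na.e === nb.e;
}

console.log(equal({ c: 5n, e: -1 }, { c: 50n, e: -2 })); // true
```

Nothing in this comparison constrains how the values are stored, which is the point: the data model and the equivalence relation are separable design decisions.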
@peteroupc Good points, thanks for writing in.
I've seen no indication that he changed his mind. I don't think we'll get everyone in the world to agree on this tradeoff--that's probably why we see a diversity of answers among other systems with decimals. What's your opinion?
I agree. In general, we'll only specify the observable semantics, which will relate to arithmetic, comparison and formatting. Implementations will be free to use unnormalized or normalized logic internally, since normalization "commutes"--you could do it before or afterwards and get the same answer. In everything I've seen him say and write, @waldemarhorwat has been very careful to phrase this as: there should be no way to distinguish among decimals in a cohort; I've been a little more sloppy and intuitive when I talk about these things.
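The "commutes" claim can be checked in a toy model (again, an assumed `{ c, e }` coefficient/exponent representation, not the proposal): normalizing the operands first, or normalizing only the final result, lands on the same normalized value:

```javascript
// Toy normalization and exponent-aligning addition. (Illustrative only.)
function normalize({ c, e }) {
  while (c !== 0n && c % 10n === 0n) { c /= 10n; e += 1; }
  return { c, e };
}
function add(a, b) {
  const e = Math.min(a.e, b.e);
  const up = (x) => x.c * 10n ** BigInt(x.e - e);
  return { c: up(a) + up(b), e };
}

const a = { c: 1200n, e: -3 }; // 1.200
const b = { c: 30n, e: -2 };   // 0.30

const after  = normalize(add(a, b));                       // normalize last
const before = normalize(add(normalize(a), normalize(b))); // normalize first

console.log(after, before); // both { c: 15n, e: -1 }, i.e. 1.5
```

So an implementation can keep whichever internal form is faster, as long as the observable comparison and formatting behavior is the specified one.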
As mentioned in plenary and other conversations, Intl.PluralRules needs non-normalized decimals. "1 email" and "1.0 emails" are different. |
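This is observable today with the shipping `Intl.PluralRules` API, which accepts the standard digit options such as `minimumFractionDigits`: with one forced fraction digit, 1 formats as "1.0", whose English plural category is "other" rather than "one":

```javascript
// English cardinal rules: the "one" category requires no visible fraction
// digits, so 1 selects "one" but 1.0 selects "other".
const plain = new Intl.PluralRules("en");
const oneFractionDigit = new Intl.PluralRules("en", { minimumFractionDigits: 1 });

console.log(plain.select(1));            // "one"   → "1 email"
console.log(oneFractionDigit.select(1)); // "other" → "1.0 emails"
```

A normalized BigDecimal could not convey the "1.0" intent to PluralRules by itself; the fraction-digit information would have to travel separately.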
So toString exposes this, and comparison ignores it.
This proposal starts off with the idea that trailing zeroes should be preserved, for reasons discussed in various comments in the Decimal FAQ. Should we instead normalize BigDecimals, i.e., make "precision" not a meaningful part of the BigDecimal values, just of operations on the data? Some comments from @ljharb in various threads (#11 #12) made me suspect he was leaning in this direction.