Validate incorrectly throws precision mismatch error for Table_Delimited #681
Comments
Reviewed this one and it is tricky. The table has the entry … The tricky bit is: do we treat a space as a 0, or as an error? Obviously the character table is assuming a space is 0 while the delimited table is not. From a PDS product-definition point of view, which table is correct?
Validate will still throw the error without the space (that's why I included the *_no_space.csv file), so there is some fundamental difference in the way the two table types are being read. While adding a 0 would solve the problem, I think that implies false precision. So there isn't currently a way for a delimited table with variable precision (and a field_format, which SBN requires) to be valid.
Yes, I was being rather whitespace-agnostic, in the sense that blank and a space are the same. For character tables, a space has to be provided, while delimiters are optional. I do not know the PDS rules or concepts, nor do I pretend to. Given just what is in this discussion: if the precision is defined to be 2 digits after the decimal but only one is provided, then either the missing digit has to be presumed 0 to make 2 digits, or it is an error because it is not knowable what to use for the missing digit. If we can presume missing digits to be 0, then the delimited table is in error. If we cannot presume them to be 0, then the character table is in error. It all leans back on what is meant by precision for PDS, which is why I flubbed it off to @jordanpadams
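The two interpretations can be sketched concretely. This is a minimal illustration in Python, not Validate's actual implementation: the `%8.2f` format string, the sample value, and the helper names are all assumptions made up for this sketch.

```python
import re

def decimal_places(value: str) -> int:
    """Count digits after the decimal point in a numeric string."""
    value = value.strip()
    return len(value.split(".", 1)[1]) if "." in value else 0

def check_precision(value: str, field_format: str, strict: bool) -> bool:
    """Compare a value's precision against a C-style field_format like '%8.2f'.

    strict=True  -> precision must match exactly (the behavior the delimited
                    table is currently getting)
    strict=False -> field_format is only a maximum (the character-table /
                    StdRef 4B.1.2 reading)
    """
    m = re.match(r"%[-+]?\d*\.(\d+)f", field_format)
    if m is None:
        return True  # no precision specified, nothing to check
    declared = int(m.group(1))
    actual = decimal_places(value)
    return actual == declared if strict else actual <= declared

# "3.1" has one digit of precision where "%8.2f" declares two:
print(check_precision("3.1", "%8.2f", strict=True))   # False: precision mismatch
print(check_precision("3.1", "%8.2f", strict=False))  # True: within the maximum
```

Under the strict reading the delimited table is in error; under the maximum reading it is fine, which is exactly the ambiguity being debated.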
@benjhirsch I need to dust off some old brain cells, but I am pretty sure delimited tables cannot have whitespace as padding for a float/integer/etc., because the value in a delimited table includes everything within the delimiters. So in reality, the ff_del test with the whitespace padding should probably throw a datatype mismatch error, not a precision error.
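That distinction can be shown with a short sketch (Python; the function name and the returned strings are hypothetical, chosen only to illustrate the classification, and do not reflect Validate's code):

```python
def classify_delimited_field(raw: str) -> str:
    """Classify a raw delimited-table field as a real number.

    In a delimited table the value is everything between the delimiters,
    so leading/trailing whitespace is part of the value and makes it an
    invalid number rather than a padded one.
    """
    if raw != raw.strip():
        return "datatype mismatch"   # padding is part of the value
    try:
        float(raw)
    except ValueError:
        return "datatype mismatch"   # not a parseable real at all
    return "ok"

print(classify_delimited_field("3.10"))   # ok
print(classify_delimited_field(" 3.1"))   # datatype mismatch
```

On this reading, the whitespace-padded test file would fail before any precision check is even reached.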
@al-niessner for this ticket, I think we want to throw a datatype mismatch (since …)
@jordanpadams I'm still seeing the same error being thrown.
@al-niessner it will be closed by #815. Although, while I was testing it, it actually looks like we are not checking precision at all for character tables, so I am trying to figure that out now.
Yeah, I saw later that #815 was a pull request and not an issue. Not sure why you think character tables are not being checked at all, but if you point me at it I may be able to help.
@al-niessner it is actually a very, very weird oddity yet again with the standard. I will add you to a thread I just started with @jshughes and Co.
Checked for duplicates
Yes - I've already checked
🐛 Describe the bug
When performing content validation on a delimited table, Validate throws a field_value_format_precision_mismatch error if any value in a field has less precision than dictated by field_format.
🕵️ Expected behavior
Per StdRef 4B.1.2, field_format defines the maximum precision, not the only precision allowed:
Validate correctly does not throw an error when the same table is described by a Table_Character object.
StdRef 4C.2, which describes delimited tables, states:
...so the same behavior should apply.
📜 To Reproduce
Run Validate on the included labels. Check the validation report.
🖥 Environment Info
📚 Version of Software Used
Validate Version 3.2.0
🩺 Test Data / Additional context
Included in the .zip file are:
- Table_Delimited label
- Table_Character label
- CSV table that produces the error for delimited but not character
- CSV table with the whitespace removed that exhibits the same behavior (just in case)
- Validation reports showing the error
ff_test.zip
🦄 Related requirements
No response
⚙️ Engineering Details
No response
I&T
TestRail Test ID: T8681195