validate rounded numeric values #111
Comments
- added a separate test class for numeric values
- added a test class for the numericRounding method
This new behaviour may lead to duplicate coord errors that do not actually exist in the data. It was my understanding that the precision of coordinates in ILI2 is not restricted. So a coordinate like
was considered VALID against a definition of
From my point of view, the precision should be enforced in the input data: the sending application is responsible for avoiding duplicate coords and for delivering coordinates with the precision defined in the model. Please reopen to discuss this topic further.
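To make the concern concrete (the coordinate and model definition referred to above are not shown here, so the values below are made up): two vertices that differ only beyond the model's precision collapse to the same value once they are rounded, and the validator then reports a duplicate coord that is not present in the delivered data. A minimal Java sketch, assuming 3 decimal places and half-up rounding:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DuplicateAfterRoundingDemo {
    public static void main(String[] args) {
        // Hypothetical base model precision: 3 decimal places
        // (e.g. a COORD range defined as 2460000.000 .. 2870000.000).
        int modelScale = 3;

        // Two distinct x-values as delivered in the transfer file,
        // encoded with more precision than the model defines.
        BigDecimal x1 = new BigDecimal("2611263.42341");
        BigDecimal x2 = new BigDecimal("2611263.42289");

        BigDecimal r1 = x1.setScale(modelScale, RoundingMode.HALF_UP); // 2611263.423
        BigDecimal r2 = x2.setScale(modelScale, RoundingMode.HALF_UP); // 2611263.423

        System.out.println("equal in input:       " + x1.equals(x2));            // false
        System.out.println("equal after rounding: " + (r1.compareTo(r2) == 0));  // true
    }
}
```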
It is valid to encode numeric values with additional precision (because of extended models). Nevertheless, the data must still be valid against the base definition/model. Therefore the encoded value is rounded, and this rounded value is then validated.
Ok, I understand the mechanism with base definitions and your explanation makes sense. So I suggest that when ilivalidator reports duplicate coord errors, the coordinates of the error location should be reported with the original precision of the input data and not with the rounded precision (as is done now with ilivalidator v1.9.0). Getting the error location with the original precision helps localise the error in the data visually (using a GIS) and textually (in the xtf file). The rounded coordinates do not show up in the original data and are therefore less useful.
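A hypothetical sketch of that suggestion (not the actual ilivalidator code; the class and method names are made up): the original lexical value is carried alongside the rounded value, so that checks run on the rounded value while error messages quote the input precision.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

/** Hypothetical value holder: validation uses the rounded value, reporting uses the original text. */
final class NumericValue {
    private final String originalText;   // exactly as read from the XTF
    private final BigDecimal rounded;    // rounded to the precision of the (base) model

    NumericValue(String originalText, int modelScale) {
        this.originalText = originalText;
        this.rounded = new BigDecimal(originalText).setScale(modelScale, RoundingMode.HALF_UP);
    }

    BigDecimal forValidation() { return rounded; }        // e.g. uniqueness / range checks
    String forErrorMessage()   { return originalText; }   // localising the error in the source data
}
```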
That would be nice, but it is a big effort, because the rounding and some of the validations (e.g. uniqueness checks) are in very different code locations.
While upgrading our web service to v1.9.2 I ran into some problems with this new behaviour. When dealing with "Einlenker" of streets that are segmentized, and you copy/paste them from a cadastral surveying shapefile (or similar) into e.g. land use planning data, you end up with these "duplicate coord at" errors. This leads to misshapen "Einlenker" when you have to delete some of the vertices. And I fear this will be very hard to communicate as a whole...
Round the value as read from the XTF to the accuracy defined by the ili model, and then validate against that rounded value.
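A minimal sketch of that behaviour (not the actual ilivalidator implementation; the scale and range are assumed to come from the compiled ili model, and half-up rounding is an assumption):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class RoundedNumericValidation {

    /** Round the value read from the XTF to the model precision, then check it against the model range. */
    static BigDecimal validateRounded(String xtfValue, int modelScale,
                                      BigDecimal modelMin, BigDecimal modelMax) {
        BigDecimal rounded = new BigDecimal(xtfValue).setScale(modelScale, RoundingMode.HALF_UP);
        if (rounded.compareTo(modelMin) < 0 || rounded.compareTo(modelMax) > 0) {
            throw new IllegalArgumentException(
                    "value " + xtfValue + " out of range after rounding: " + rounded);
        }
        return rounded; // subsequent checks (e.g. duplicate coords) operate on this rounded value
    }

    public static void main(String[] args) {
        // Hypothetical base definition: 0.000 .. 100.000 (3 decimals); an extended model delivers more digits.
        System.out.println(validateRounded("12.34567", 3,
                new BigDecimal("0.000"), new BigDecimal("100.000"))); // prints 12.346
    }
}
```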