Currently, Argus accepts JSON payloads with duplicate object names, e.g.
{"identifier" : "minimal0","data": {"minnie":"mouse"},"ttl":300,"identifier" : "minimal1","data": {"mickey":"mouse"},"ttl":600}
This is what's stored:
{"identifier":"minimal1","data":{"mickey":"mouse","minnie":"mouse"},"ttl":572}
Non-unique names appear to be allowed in JSON, but this results in inconsistent behavior: the values for identifier and ttl are replaced by the later duplicates, while the data field values are merged.
This really isn't a server bug per se; it's how Go's encoding/json package behaves. Incidentally, this behavior is common: most languages and platforms handle duplicate fields by taking the last occurrence of the field in the stream. The JSON RFC is silent on how to unmarshal messages with duplicate fields; taking the last occurrence is simply the most common choice made by API and framework engineers.
Any way we might try to fix this in the server would result in inconsistent behavior from someone's point of view. Arguably, this is a client bug. It would be nice for our server software to detect it, but that isn't really feasible given the current state of things.
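A minimal sketch of the encoding/json behavior described above, assuming a struct shaped like the issue's payload (the Item type and its field names here are illustrative, not Argus's actual types). Scalar fields keep the last duplicate, while the data map ends up merged because Unmarshal reuses a non-nil map and keeps its existing entries:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Item is a hypothetical stand-in for the stored document.
type Item struct {
	Identifier string            `json:"identifier"`
	Data       map[string]string `json:"data"`
	TTL        int               `json:"ttl"`
}

// decode unmarshals a raw payload, duplicates and all.
func decode(payload string) (Item, error) {
	var item Item
	err := json.Unmarshal([]byte(payload), &item)
	return item, err
}

func main() {
	payload := `{"identifier":"minimal0","data":{"minnie":"mouse"},"ttl":300,` +
		`"identifier":"minimal1","data":{"mickey":"mouse"},"ttl":600}`

	item, err := decode(payload)
	if err != nil {
		panic(err)
	}

	// Scalar fields take the last occurrence; the two "data" objects
	// are merged because the second is decoded into the existing map.
	fmt.Println(item.Identifier) // minimal1
	fmt.Println(item.TTL)        // 600
	fmt.Println(item.Data)       // map[mickey:mouse minnie:mouse]
}
```

Note there is no error returned for the duplicate names, which is why detecting this server-side would require re-tokenizing the raw payload rather than relying on Unmarshal.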