
Bug: Malformated JSON array with Chunking=True #8212

Closed · mvadu opened this issue Mar 27, 2017 · 2 comments
mvadu (Contributor) commented Mar 27, 2017

Latest Nightly (1.3.0~n201703250800)

Steps to reproduce:

  1. Create a measurement with 50 entries
  2. Query with chunked=true and chunk_size=10

Expected behavior:

{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","tags":null,"columns":["time","Val"],"values":[["1490617504908565600","0"],["1490617504922077700","1"],["1490617504938096800","2"],["1490617504954622300","3"],["1490617504969139900","4"],["1490617504985336400","5"],["1490617505000853700","6"],["1490617505015877900","7"],["1490617505031914600","8"],["1490617505047468600","9"]],"partial":true}],"partial":true},
{"statement_id":0,"series":[{"name":"ChunkTest","tags":null,"columns":["time","Val"],"values":[["1490617505063274500","10"],["1490617505077942400","11"],["1490617505094766500","12"],["1490617505110072200","13"],["1490617505125037800","14"],["1490617505140589200","15"],["1490617505156633400","16"],["1490617505172432100","17"],["1490617505188221700","18"],["1490617505203536700","19"]],"partial":true}],"partial":true},
{"statement_id":0,"series":[{"name":"ChunkTest","tags":null,"columns":["time","Val"],"values":[["1490617505219432600","20"],["1490617505234789300","21"],["1490617505251885700","22"],["1490617505268944500","23"],["1490617505282987900","24"],["1490617505296537000","25"],["1490617505312332500","26"],["1490617505328791800","27"],["1490617505345209300","28"],["1490617505359656100","29"]],"partial":true}],"partial":true},
{"statement_id":0,"series":[{"name":"ChunkTest","tags":null,"columns":["time","Val"],"values":[["1490617505375771900","30"],["1490617505390766500","31"],["1490617505406404400","32"],["1490617505422708900","33"],["1490617505437906100","34"],["1490617505453047500","35"],["1490617505469419700","36"],["1490617505484953700","37"],["1490617505500062100","38"],["1490617505516507400","39"]],"partial":true}],"partial":true},
{"statement_id":0,"series":[{"name":"ChunkTest","tags":null,"columns":["time","Val"],"values":[["1490617505531509500","40"],["1490617505547601500","41"],["1490617505563125200","42"],["1490617505578034600","43"],["1490617505594250700","44"],["1490617505609260800","45"],["1490617505625294100","46"],["1490617505640319700","47"],["1490617505657393900","48"],["1490617505671827100","49"]],"partial":false}],"partial":false}]}

Actual behavior:

{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","columns":["time","Val"],"values":[[1490584276233503400,0],[1490584276247138800,1],[1490584276263674700,2],[1490584276277906600,3],[1490584276293388400,4],[1490584276311209600,5],[1490584276326084200,6],[1490584276341423000,7],[1490584276357200800,8],[1490584276372130200,9]],"partial":true}],"partial":true}]}
{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","columns":["time","Val"],"values":[[1490584276387552200,10],[1490584276403257200,11],[1490584276419363600,12],[1490584276434843200,13],[1490584276449978300,14],[1490584276466235800,15],[1490584276481253300,16],[1490584276496574000,17],[1490584276513170700,18],[1490584276528502000,19]],"partial":true}],"partial":true}]}
{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","columns":["time","Val"],"values":[[1490584276544037300,20],[1490584276559343100,21],[1490584276574752900,22],[1490584276594558500,23],[1490584276606538000,24],[1490584276622501200,25],[1490584276638082500,26],[1490584276653840200,27],[1490584276669319700,28],[1490584276684097000,29]],"partial":true}],"partial":true}]}
{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","columns":["time","Val"],"values":[[1490584276700208200,30],[1490584276716050800,31],[1490584276731527700,32],[1490584276746938100,33],[1490584276763240500,34],[1490584276778480700,35],[1490584276793635900,36],[1490584276809912100,37],[1490584276825775700,38],[1490584276841033300,39]],"partial":true}],"partial":true}]}
{"results":[{"statement_id":0,"series":[{"name":"ChunkTest","columns":["time","Val"],"values":[[1490584276856816700,40],[1490584276871956700,41],[1490584276887248800,42],[1490584276903294600,43],[1490584276918772500,44],[1490584276935431700,45],[1490584276950390900,46],[1490584276965931500,47],[1490584276981850300,48],[1490584276996947400,49]]}]}]}

The difference between the expected and actual output is the additional results objects. Since results is already an array, the second set of partial results should not start another top-level object. This causes problems for deserializers like Newtonsoft.Json#1261.
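The actual output is newline-delimited JSON: each chunk is a complete, standalone JSON document rather than an element of one top-level array. A minimal Python sketch (with abbreviated chunk bodies for readability) showing why a whole-buffer parse fails while a per-line parse succeeds:

```python
import json

# Two chunks as the server actually sends them: one complete JSON
# document per line, not elements of a single top-level array.
raw = (
    '{"results":[{"statement_id":0,"partial":true}]}\n'
    '{"results":[{"statement_id":0}]}\n'
)

# json.loads(raw) would raise: the buffer holds two documents.
# Parsing each line separately succeeds.
chunks = [json.loads(line) for line in raw.splitlines()]
print(len(chunks))                          # 2
print(chunks[0]["results"][0]["partial"])   # True
```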

@mvadu mvadu changed the title Malformated JSON array with Chunking=True Bug: Malformated JSON array with Chunking=True Mar 29, 2017
jsternberg (Contributor) commented:
When chunked=true is specified, multiple JSON bodies are sent. I don't see an error here. Some JSON parsers have difficulty with that and throw an error, but they really shouldn't.

We can't really change the format. First, sending the JSON chunked but as a single JSON body would defeat the purpose of chunked results. You would not be able to read any of the points until the entire JSON body was consumed. Second, that would break backwards compatibility.

I'm not sure this is a bug with InfluxDB as much as a problem with the JSON parsing library.
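A client-side parser can consume a stream of concatenated JSON bodies without any server-side format change. A sketch using Python's json.JSONDecoder.raw_decode (the function name iter_json_documents is illustrative, not part of any InfluxDB client library):

```python
import json

def iter_json_documents(buffer: str):
    """Yield each complete JSON document from a buffer that may
    contain several concatenated documents."""
    decoder = json.JSONDecoder()
    idx = 0
    while idx < len(buffer):
        # Skip whitespace (including the newline separators) between documents.
        while idx < len(buffer) and buffer[idx].isspace():
            idx += 1
        if idx >= len(buffer):
            break
        # raw_decode parses one document and reports where it ended.
        obj, end = decoder.raw_decode(buffer, idx)
        yield obj
        idx = end

docs = list(iter_json_documents(
    '{"results":[{"statement_id":0,"partial":true}]}\n'
    '{"results":[{"statement_id":0}]}'
))
print(len(docs))  # 2
```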

mvadu (Contributor, Author) commented Jun 3, 2017

Closing the loop: this was an issue with the default buffering in the .NET HTTP client. Without the HttpCompletionOption.ResponseHeadersRead parameter, GetAsync waits until the entire response is buffered on the client side, which defeats the purpose of chunking. Setting HttpCompletionOption and reading the response as a stream line by line works like a charm.
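The same streaming pattern sketched in Python: iterate the response body line by line and parse each line as soon as it arrives, rather than buffering the whole response first. Here a StringIO stands in for the live HTTP stream (in a real .NET client this corresponds to the HttpCompletionOption.ResponseHeadersRead approach described above; read_chunked_results is an illustrative name, not a library API):

```python
import io
import json

def read_chunked_results(response):
    """Consume a chunked query response incrementally: each line is
    one complete JSON document, usable before the response ends."""
    for line in response:       # reads line by line, no full buffering
        line = line.strip()
        if line:
            yield json.loads(line)

# Simulated streaming response body (abbreviated chunk contents).
response = io.StringIO(
    '{"results":[{"statement_id":0,"partial":true}]}\n'
    '{"results":[{"statement_id":0}]}\n'
)
parsed = list(read_chunked_results(response))
print(len(parsed))  # 2
```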
