This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →

System.Text.Json.JsonException The JSON value could not be converted to System.String. Path: $.error.code | LineNumber: 0 | BytePositionInLine: 20. #408

Closed
SunnyLeu opened this issue Nov 8, 2023 · 32 comments
Labels
bug Something isn't working enhancement New feature or request

Comments

@SunnyLeu commented Nov 8, 2023

Describe the bug
I call the API as usual, but it throws System.Text.Json.JsonException:
The JSON value could not be converted to System.String. Path: $.error.code | LineNumber: 0 | BytePositionInLine: 20.

Your code piece

ChatCompletionCreateRequest chatCompletionCreateRequest = new ChatCompletionCreateRequest {
    Messages = new List<ChatMessage>(),
    Model = Model,
};

ChatCompletionCreateResponse chatCompletionCreateResponse;
try {
    chatCompletionCreateResponse = await openAiService.ChatCompletion.CreateCompletion(chatCompletionCreateRequest);
} catch (Exception ex) {
    Console.WriteLine($"{ex.GetType().FullName}<br>{ex.Message}");
    throw;
}

Result
System.Text.Json.JsonException
The JSON value could not be converted to System.String. Path: $.error.code | LineNumber: 0 | BytePositionInLine: 20.
InvalidOperationException: Cannot get the value of a token type 'Number' as a string.
at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack& state, Utf8JsonReader& reader, Exception ex)
at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
at System.Text.Json.JsonSerializer.ContinueDeserialize[TValue](ReadBufferState& bufferState, JsonReaderState& jsonReaderState, ReadStack& readStack, JsonTypeInfo jsonTypeInfo)
at System.Text.Json.JsonSerializer.ReadFromStreamAsync[TValue](Stream utf8Json, JsonTypeInfo jsonTypeInfo, CancellationToken cancellationToken)
at System.Net.Http.Json.HttpContentJsonExtensions.ReadFromJsonAsyncCore[T](HttpContent content, Encoding sourceEncoding, JsonSerializerOptions options, CancellationToken cancellationToken)
at OpenAI.Extensions.HttpClientExtensions.PostAndReadAsAsync[TResponse](HttpClient client, String uri, Object requestModel, CancellationToken cancellationToken)
at OpenAI.Managers.OpenAIService.CreateCompletion(ChatCompletionCreateRequest chatCompletionCreateRequest, String modelId, CancellationToken cancellationToken)
at 【MyCode】

Expected behavior
API returns ChatGPT's reply

Screenshots
(screenshot of the exception attached in the original issue)

Desktop (please complete the following information):

  • OS: Windows 11
  • Language C#
@SunnyLeu (Author) commented Nov 8, 2023

(screenshot of https://status.openai.com)

OK, I think I found the problem: OpenAI is having issues with their services.

@mikemike396

Seems to be back up. Error went away for us.

@kayhantolga (Member)

Seems like an OpenAI outage; if not, please reopen the issue.

@kayhantolga closed this as not planned (won't fix, can't repro, duplicate, stale) Nov 9, 2023
@xbaha commented Nov 11, 2023

Reopen the issue.

It's not an OpenAI outage problem.
I just received the exact same error:

Message: "The JSON value could not be converted to System.String. Path: $.error.code | LineNumber: 0 | BytePositionInLine: 20."

    at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack& state, Utf8JsonReader& reader, Exception ex)
    at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
    at System.Text.Json.Serialization.Metadata.JsonTypeInfo`1.ContinueDeserialize(ReadBufferState& bufferState, JsonReaderState& jsonReaderState, ReadStack& readStack)
    at System.Text.Json.Serialization.Metadata.JsonTypeInfo`1.DeserializeAsync(Stream utf8Json, CancellationToken cancellationToken)
    at System.Net.Http.Json.HttpContentJsonExtensions.ReadFromJsonAsyncCore[T](HttpContent content, JsonSerializerOptions options, CancellationToken cancellationToken)
    at OpenAI.Extensions.HttpClientExtensions.PostAndReadAsAsync[TResponse](HttpClient client, String uri, Object requestModel, CancellationToken cancellationToken)
    at OpenAI.Managers.OpenAIService.CreateCompletion(ChatCompletionCreateRequest chatCompletionCreateRequest, String modelId, CancellationToken cancellationToken)
    at YouTubeSponseredByScheduler.Tasks.ChatGPT.GenerateResponse2(String FromSystem, String FromUser, Nullable`1 Temperature, Int32 modelSize)

When I send around 2k tokens and expect back another 3-4k tokens, the API works; once I increase the request to more than 4k tokens and expect back ~6k tokens, I get this error.
The request took about 7-8 minutes to get the response.
On OpenAI's side the request seems to be processed, because the tokens show up in my usage and I got charged for them.
So the only problem left is the JSON parser.
If someone can tell me where to find the data received before it was parsed into a JSON object, I might be able to debug the JSON problem.
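One way to see the raw payload before it is parsed (a sketch of my own, not part of the SDK) is to wire a logging DelegatingHandler into the HttpClient passed to OpenAIService:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Logs every raw response body before any JSON deserialization happens.
// Note: this buffers the whole body, so it is not suitable for streaming calls,
// and re-wrapping the content as "application/json" is a simplification.
public sealed class RawLoggingHandler : DelegatingHandler
{
    public RawLoggingHandler(HttpMessageHandler inner) : base(inner) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);
        var body = await response.Content.ReadAsStringAsync();
        Console.WriteLine($"HTTP {(int)response.StatusCode} from {request.RequestUri}:\n{body}");
        // Re-wrap the content so downstream code can still read it.
        response.Content = new StringContent(body, Encoding.UTF8, "application/json");
        return response;
    }
}
```

Hypothetical wiring, assuming the OpenAIService overload that accepts a custom HttpClient: `new OpenAIService(new OpenAiOptions { ApiKey = key }, new HttpClient(new RawLoggingHandler(new HttpClientHandler())))`.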

@kayhantolga reopened this Nov 11, 2023
@kayhantolga (Member)

@xbaha, I believe you are encountering a different issue, but I will keep this issue open. I think the SDK should provide more information about errors in such cases. If possible, it would be helpful for me to reproduce the issue on my client; if your prompt is not private, could you share it? Alternatively, you can use Laser Cat Eyes to debug APIs (sample usage available in the playground), observe the incoming response, and share it with us.

@kayhantolga added the bug and enhancement labels Nov 11, 2023
@xbaha commented Nov 11, 2023

Hi, thank you for reopening this issue.
I have uploaded my prompt here:

https://we.tl/t-S31MpkODAV

Use gpt-3.5-turbo-16k, temp = 0.0f.
Add this to the client, because it takes a long time:

            var openAiService = new OpenAIService(new OpenAiOptions()
            {
                ApiKey = apiKey2
            }, 
            new HttpClient()
            {
                Timeout = TimeSpan.FromMinutes(50)
            });

I just tested it and I got this error.

@roldengarm

Just wanted to add that error handling is not ideal at the moment, by the looks of it. The SDK seems to throw an InvalidOperationException when OpenAI returns a 502 or 503 error. (App Insights screenshot from during last week's outage omitted.)

Is there any way to catch the actual error thrown by OpenAI, such that we can show a proper message in the UI?
We're still on betalgo.openai.gpt3

@kayhantolga (Member)

@roldengarm OpenAI is supposed to return an error message and code when encountering issues. These fields are available through the SDK response. However, in cases where a proper response cannot be returned due to an outage or internal error, the SDK is unable to provide more details. I am planning to improve this behavior in the future, but for now, my suggestion is to return an unexpected error message :/
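For reference, the pattern for reading those fields from the SDK response (a sketch based on the Betalgo package's documented surface; `openAiService` and `request` are assumed to already exist) looks roughly like:

```csharp
// Assumes the Betalgo OpenAI package is installed.
var completionResult = await openAiService.ChatCompletion.CreateCompletion(request);

if (completionResult.Successful)
{
    Console.WriteLine(completionResult.Choices.First().Message.Content);
}
else
{
    // Error can be null when OpenAI never returned a parseable error object
    // (e.g. an outage serving HTML), which is exactly the gap discussed here.
    Console.WriteLine($"OpenAI error {completionResult.Error?.Code ?? "n/a"}: " +
                      $"{completionResult.Error?.Message ?? "unknown error"}");
}
```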

@kayhantolga (Member)

@xbaha I attempted to use your sample, and it took 8 minutes to receive a response, but it was successful. The only exception I was able to reproduce was a timeout error when I set the timeout value to less than 8 minutes.
(screenshot omitted)

@xbaha commented Nov 14, 2023

@kayhantolga have you tried it with the NuGet package? I've never tried Laser Cat Eyes, and I'm not sure why you tried it there, because (as I mentioned) the prompt works in OpenAI but not in this C# package. Could you try it in C# and see whether you get an exception?

@kayhantolga (Member)

@xbaha Laser Cat Eyes is a tool that simply displays incoming and outgoing data. I tried using the source code (in the playground project), but now I'm going to try using the NuGet package again. They should behave in the same way, but let's see.

@StefanCop

For analysis, I enhanced the method in HttpClientExtensions with a try-catch to capture the JsonException and include the HTTP status code:

    try
    {
        return await response.Content.ReadFromJsonAsync<TResponse>(cancellationToken: cancellationToken) ?? throw new InvalidOperationException();
    }
    catch (System.Text.Json.JsonException ex)
    {
        string content = "<<unknown>>";
        try { content = response.Content.ReadAsStringAsync().Result; } catch { }
        throw new InvalidOperationException($"{ex.Message}, Http {response.StatusCode} {response.ReasonPhrase}: {content}", ex);
    }

In most cases it has been a 'Bad Gateway' HTTP error status and reason, which was the cause of the JsonException.
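The same idea can be made defensive up front: only hand the body to the JSON deserializer when the status code and the first non-whitespace character look like JSON. A sketch (`ResponseGuard` is my name for illustration, not an SDK type):

```csharp
using System.Net;

// Decides whether a response body is worth handing to the JSON deserializer.
// Non-2xx responses from a gateway (502/503) typically carry an HTML error
// page, which is what turns into the confusing JsonException downstream.
public static class ResponseGuard
{
    public static bool LooksLikeJson(HttpStatusCode statusCode, string body)
    {
        if ((int)statusCode < 200 || (int)statusCode > 299) return false;
        var trimmed = body.TrimStart();
        return trimmed.StartsWith("{") || trimmed.StartsWith("[");
    }
}
```

When this returns false, the raw body (often an HTML error page) can be surfaced in the exception message instead of failing inside the deserializer.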

@asiryan commented Nov 16, 2023

System.Text.Json.JsonException: ''<' is an invalid start of a value. Path: $ | LineNumber: 0 | BytePositionInLine: 0.'
JsonReaderException: '<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.

@asiryan commented Nov 16, 2023

@kayhantolga I got this error too.

@asiryan commented Nov 16, 2023

@kayhantolga why don't you use Newtonsoft.Json?

@i542873057

I got this error too.

@StefanCop

(Quoting my earlier comment above, where I added a try-catch around ReadFromJsonAsync.)

I have added this catch of JsonException to find the cause. In the cases I had, it was always an HTTP status code > 299 (like 502 Bad Gateway). To me it makes sense that there is no valid JSON in HTTP error cases.

@belaszalontai (Contributor)

System.Text.Json.JsonException: ''<' is an invalid start of a value. Path: $ | LineNumber: 0 | BytePositionInLine: 0.' JsonReaderException: '<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.

This exception means that the response body is HTML instead of JSON. As @StefanCop mentioned, it could be an HTTP error with an HTML error page in the body.

@xbaha commented Nov 17, 2023

@belaszalontai the response works in Laser Cat Eyes, which means it was returned as valid JSON to this NuGet package; what happens later when parsing it is the problem.

@PabloOteroDeMiguel

System.Text.Json.JsonException The JSON value could not be converted to System.String. Path: $.error.code | LineNumber: 0 | BytePositionInLine: 20.

Hello everyone, I'm having the same issue in about a third of my requests for the last two days.

var completionResult = await openAiService.ChatCompletion.CreateCompletion(new ChatCompletionCreateRequest
{
    Messages = new List<ChatMessage>
    {
        ChatMessage.FromSystem(system),
        ChatMessage.FromUser(OCR),

    },
    Model = Models.Gpt_3_5_Turbo_16k,
    MaxTokens = 1000,
    Temperature = 0.4f,
    TopP = 1,
    PresencePenalty = 0,
    FrequencyPenalty = 0,
});

Any update about it? Or idea how to solve it?

@DanDiplo commented Dec 1, 2023

Just to note I've seen this issue, too. It seems sporadic and not easily reproducible, given the non-deterministic way that even the same prompt can generate different results. But it does happen, and I suspect it has to do with the size of the response.

System.Text.Json.JsonException: '<' is an invalid start of a value. Path: $ | LineNumber: 0 | BytePositionInLine: 0.
 ---> System.Text.Json.JsonReaderException: '<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.
   at System.Text.Json.ThrowHelper.ThrowJsonReaderException(Utf8JsonReader& json, ExceptionResource resource, Byte nextByte, ReadOnlySpan`1 bytes)
   at System.Text.Json.Utf8JsonReader.ConsumeValue(Byte marker)
   at System.Text.Json.Utf8JsonReader.ReadFirstToken(Byte first)
   at System.Text.Json.Utf8JsonReader.ReadSingleSegment()
   at System.Text.Json.Utf8JsonReader.Read()
   at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
   --- End of inner exception stack trace ---
   at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack& state, JsonReaderException ex)
   at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
   at System.Text.Json.JsonSerializer.ReadCore[TValue](JsonConverter jsonConverter, Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
   at System.Text.Json.JsonSerializer.ReadCore[TValue](JsonReaderState& readerState, Boolean isFinalBlock, ReadOnlySpan`1 buffer, JsonSerializerOptions options, ReadStack& state, JsonConverter converterBase)
   at System.Text.Json.JsonSerializer.ContinueDeserialize[TValue](ReadBufferState& bufferState, JsonReaderState& jsonReaderState, ReadStack& readStack, JsonConverter converter, JsonSerializerOptions options)
   at System.Text.Json.JsonSerializer.ReadAllAsync[TValue](Stream utf8Json, JsonTypeInfo jsonTypeInfo, CancellationToken cancellationToken)
   at System.Net.Http.Json.HttpContentJsonExtensions.ReadFromJsonAsyncCore[T](HttpContent content, Encoding sourceEncoding, JsonSerializerOptions options, CancellationToken cancellationToken)
   at OpenAI.Extensions.HttpClientExtensions.PostAndReadAsAsync[TResponse](HttpClient client, String uri, Object requestModel, CancellationToken cancellationToken)
   at OpenAI.Managers.OpenAIService.CreateCompletion(ChatCompletionCreateRequest chatCompletionCreateRequest, String modelId, CancellationToken cancellationToken)

@belaszalontai (Contributor) commented Dec 1, 2023

As I said, this error message

System.Text.Json.JsonException: '<' is an invalid start of a value. Path: $ | LineNumber: 0 | BytePositionInLine: 0.
---> System.Text.Json.JsonReaderException: '<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.
at System.Text.Json

means that the BODY of the response is HTML instead of JSON.

I suspect that sometimes the response from the OpenAI API is, for some reason, HTML rather than JSON.
What if you explicitly set the response format in the request?
https://platform.openai.com/docs/api-reference/chat/create#chat-create-response_format

On the other hand, if that doesn't help, then we need stably reproducible code or data, or just a request, to be able to find the root cause.
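Setting it explicitly with this SDK might look roughly like the fragment below; the `ResponseFormat` property and type names are assumed from SDK versions that added response_format support, so check your version:

```csharp
var request = new ChatCompletionCreateRequest
{
    Messages = messages,
    Model = Models.Gpt_3_5_Turbo_1106,
    // Wire format: "response_format": { "type": "text" } (or "json_object").
    ResponseFormat = new ResponseFormat { Type = "text" },
};
```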

@DanDiplo commented Dec 1, 2023

@belaszalontai I think you are on to something, as I'm asking in the prompt for the output to be formatted as HTML, but I'm not explicitly setting the output format (so would assume it would return as "text" but formatted as HTML). For most cases, this seems to work fine as it returns text formatted as HTML. So I can run the same prompt 10 times and 9 times will work, but just occasionally it fails. I'll look at explicitly setting the output format as text, to see if that makes any difference.

So basically I'm passing it a load of text and then saying (as a system prompt):

"You will output your response formatted as HTML (but not an entire document) without any other text."

I'm using gpt-4-1106-preview as the model and not explicitly setting anything else except the token limit (which is 2500). If that helps.

@belaszalontai (Contributor)

@DanDiplo I looked it up, and the OpenAI API will always return JSON in the response body; the content property within the choices array will contain the HTML page. So my response_format idea makes no sense. I also don't want to assume that the content property is wrongly escaped in OpenAI's response.
Not least because if we look again at the error message:

'<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.

The line and character position is 0. So I think the parser found raw HTML in the response body instead of a proper chat completion object.

We need to log somehow the raw response from OpenAI to be able to figure out what is received and why the SDK is not able to parse.

@xbaha commented Dec 1, 2023

@belaszalontai I think the SDK has a serious bug; I suspect it has to do with the response size. Going back to my example, I was never able to get a successful reply with this SDK. I tried another SDK, "openAIDotNet", and got the exact same error every time I called it. Then I decided to write the GenerateResponse function from scratch using HttpClientHandler/HttpRequestMessage, and it worked every time; I have never seen the error since. I've already used it over 5,000 times.

@belaszalontai (Contributor)

@xbaha Do I understand correctly that you have a root cause and a solution for this issue? Can you please share your GenerateResponse "function"?

@xbaha commented Dec 2, 2023

Sure, but it is customized for my own use case, without an SDK or anything:


      public async Task<string> GenerateResponse(string FromSystem, string FromUser, float? Temperature, string format, int modelSize)
        {
            string endpoint = "https://api.openai.com/v1/chat/completions";
            List<Message> messages = new List<Message>
                {
                    new Message
                    {
                        role = "system",
                        content = FromSystem
                    },
                    new Message
                    {
                        role = "user",
                        content = FromUser
                    }
                };
            string? model = null;
            if (modelSize == 4)
            {
                model = "gpt-3.5-turbo";
            }
            else if (modelSize == 16)
            {
                model = "gpt-3.5-turbo-1106";
            }
            else if (modelSize == 128)
            {
                model = "gpt-4-1106-preview";
            }

            string? responseFormat = null;
            if (format == "json")
            {
                responseFormat = "json_object";
            }


            using (HttpClientHandler clientHandler = new HttpClientHandler())
            {
                clientHandler.ClientCertificateOptions = ClientCertificateOption.Manual;
                clientHandler.ServerCertificateCustomValidationCallback = (sender, cert, chain, sslPolicyErrors) => true;
                clientHandler.SslProtocols = System.Security.Authentication.SslProtocols.Tls12;

                using (HttpClient client = new HttpClient(clientHandler))
                {
                    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey2);
                    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
                    // Set a timeout of 20 minutes
                    client.Timeout = TimeSpan.FromMinutes(20);

                    var requestBody = new
                    {
                        model = model,
                        messages = messages,
                        //response_format = new { type = responseFormat },
                        temperature = Temperature,
                        stream = true,
                    };

                    var json = JsonConvert.SerializeObject(requestBody);
                    var content = new StringContent(json, Encoding.UTF8, "application/json");

                    string reply = string.Empty;

                    try
                    {
                        using (var request = new HttpRequestMessage(HttpMethod.Post, endpoint) { Content = content })
                        {
                            using (var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead))
                            {
                                response.EnsureSuccessStatusCode();

                                // Stream the response.
                                using (var responseStream = await response.Content.ReadAsStreamAsync())
                                {
                                    using (var reader = new StreamReader(responseStream))
                                    {
                                        string line;
                                        while ((line = await reader.ReadLineAsync()) != null)
                                        {
                                            Console.WriteLine(line);
                                            // Remove the "data: " prefix from the line
                                            if (!string.IsNullOrEmpty(line))
                                            {
                                                string jsonLine = line.Substring(6);

                                                // Parse the JSON data
                                                JsonDocument document = JsonDocument.Parse(jsonLine);

                                                try
                                                {
                                                    // Check if the finish_reason is "length"
                                                    JsonElement root = document.RootElement;
                                                    if (root.GetProperty("choices")[0].GetProperty("finish_reason").GetString() != null)
                                                    {
                                                        // Stop the loop
                                                        break;
                                                    }

                                                    // Extract the content
                                                    reply += root.GetProperty("choices")[0].GetProperty("delta").GetProperty("content").GetString();
                                                }
                                                catch (KeyNotFoundException ex)
                                                {
                                                    Console.WriteLine("KeyNotFoundException: " + ex.Message);
                                                }
                                            }
                                        }
                                    }
                                }
                            }
                        }
                        return reply;
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine("Exception: " + ex.Message);
                    }

                    return reply;

                }
            }
        }


@belaszalontai (Contributor)

@xbaha
Thanks!

I have analyzed your code and compared it to Betalgo's CreateCompletionAsStream method in the OpenAIChatCompletions.cs class.
The main difference is only in the consumer code of the stream (the while loop).

Anyway, both implementations consume SSE incorrectly, because they handle only data events and do not consider the other possible event fields, or the keep-alive messages starting with a colon. See: https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#event_stream_format

In your code you just remove the first 6 characters from each and every line with string jsonLine = line.Substring(6);, which works for you for some reason. It is also an issue that your code stops consuming the stream after parsing finish_reason; the correct way is to wait for the [DONE] message.

The CreateCompletionAsStream method also does not handle SSE events other than "data:". I think this is the root cause, but I have no proof. To be sure, we need a raw log of the stream's messages (with event names) coming from OpenAI.
Because what if OpenAI wants to keep the connection alive with a message starting with a colon, or loses the connection and just wants to retry with a "retry:" message?
And I think there is another possible bug in this code at line 66, which starts with line +=. The JSON parser throws, and in the catch block the code at line 67 tries to parse the same unparsable line again. We need line = here. Correct me if I am wrong.
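Handling those event-stream rules explicitly could look like this small helper (my sketch of the rules above, not the SDK's code; `SseLine` is a hypothetical name):

```csharp
// Minimal SSE "data:" line handling per the event-stream format:
// comment lines start with ':', field lines are "field: value", and the
// OpenAI stream signals completion with the sentinel "data: [DONE]".
public static class SseLine
{
    // Returns true and sets payload when the line carries a data payload;
    // returns false for blank lines, comments, non-data fields, and [DONE].
    public static bool TryGetData(string line, out string payload, out bool done)
    {
        payload = string.Empty;
        done = false;
        if (string.IsNullOrEmpty(line) || line.StartsWith(":")) return false;
        if (!line.StartsWith("data:")) return false;  // event:, id:, retry:, ...
        payload = line.Substring(5).TrimStart();      // safer than a fixed Substring(6)
        if (payload == "[DONE]") { done = true; return false; }
        return true;
    }
}
```

A consumer loop would then keep reading until `done` is set, instead of breaking on finish_reason.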

@kayhantolga what is your opinion?

@xbaha commented Dec 3, 2023

Do you mean reply += and not line +=?
The only thing I don't like about the code is

string jsonLine = line.Substring(6);

I'm not sure if this is universal or I was just lucky.

I invite anyone who got the JSON error to try this code and give feedback: @DanDiplo @PabloOteroDeMiguel @i542873057 @asiryan

@belaszalontai (Contributor)

@xbaha the line += at line 66 is a possible bug in Betalgo's SDK codebase, not in yours. Sorry that I wasn't clear.

@PabloOteroDeMiguel

Hi. I only did one thing: I doubled the token limit, and the error has practically disappeared; it occurs maybe once in 100 requests now.
I hope this can help.

@kayhantolga (Member)

I have a couple of comments which hopefully explain things.

  1. Whenever I encounter the error message "'<' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.", I suspect that OpenAI is returning HTML instead of JSON due to an internal error in the OpenAI API.
  2. In this scenario, it would make sense for the SDK to throw an exception. However, I understand that this may not provide the best user experience. Therefore, I am planning to modify this behavior to return the HTML as an error message instead of throwing an error in such cases.
  3. I attempted to reproduce the @xbaha sample case, but it worked for me without any error. I'm not sure why @xbaha is encountering an error. It could be related to their account, location, or some other factor.
  4. @xbaha, I tried your case with the NuGet package. "Lasercateyes" is just an extension/tool that shows request and response details. It is a monitoring tool and there is nothing to worry about to test your case. You can use the same tool to see the HTML message in your case. It is very easy to set up, and you can try it in the "playground" project in the repository.

Here are the actions I will take:

  • I will create an issue regarding the HTML responses causing exceptions in the SDK. (Return an error message when OpenAI returns an HTML error instead of JSON. #447)
  • I will convert this issue into a discussion.
  • I will investigate further if someone can provide details about the HTML response they encountered. Otherwise, there is no point in investing more time into this issue. However, feel free to discuss it in discussion.

@betalgo locked and limited conversation to collaborators Dec 7, 2023
@kayhantolga converted this issue into discussion #448 Dec 7, 2023
