
About Error handling #343

Closed
ZeroDeng01 opened this issue Jun 6, 2023 · 4 comments · Fixed by #345

Comments

@ZeroDeng01
Contributor

ZeroDeng01 commented Jun 6, 2023

Hi, I am using Azure OpenAI ChatGPT with go-openai. Following the error handling example, I cannot catch error code 429: every time a 429 occurs, the program falls into the default code block. I am not sure whether this is a mistake on my side, or whether the content returned by Azure OpenAI differs from OpenAI's so that go-openai fails to deserialize the error message correctly. Thanks!

// OpenaiErrorAndCode maps an error returned by go-openai to an HTTP status
// code and a user-facing error.
func OpenaiErrorAndCode(openaierr error) (code int, err error) {
	var openaiError = &openai.APIError{}
	if errors.As(openaierr, &openaiError) {
		switch openaiError.HTTPStatusCode {
		case 400:
			return 400, errors.New("xxxxxxx")
		case 401:
			return 401, errors.New("xxxxxxx")
		case 429:
			return 429, errors.New("xxxxxxx")
		case 500:
			return 500, errors.New("xxxxxxx")
		default:
			return 500, errors.New("unknown:" + openaierr.Error())
		}
	}
	return 500, openaierr
}
stream, err := client.CreateChatCompletionStream(ctx, req)
if err != nil {
	fmt.Printf("ChatCompletionStream error: %v\n", err)
	_, openaierr := appError.OpenaiErrorAndCode(err)
	fmt.Println(openaierr.Error())
}
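
A minimal sketch of a fallback lookup, assuming the library wraps error responses it cannot parse in *openai.RequestError rather than *openai.APIError (the statusCode helper name is made up here): the HTTP status code can then still be recovered even when no APIError is available.

// statusCode extracts an HTTP status code from a go-openai error, falling
// back to *openai.RequestError when the error body could not be parsed.
// Sketch only; the wrapped error types depend on the go-openai version.
func statusCode(err error) (int, bool) {
	var apiErr *openai.APIError
	if errors.As(err, &apiErr) {
		return apiErr.HTTPStatusCode, true
	}
	var reqErr *openai.RequestError
	if errors.As(err, &reqErr) {
		return reqErr.HTTPStatusCode, true
	}
	return 0, false
}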

The JSON returned by Azure OpenAI is as follows:

{
    "error": {
        "code": "429",
        "message": "Requests to the Creates a completion for the chat message Operation under Azure OpenAI API version 2023-03-15-preview have exceeded token rate limit of your current OpenAI S0 pricing tier. Please retry after 20 seconds. Please go here: https://aka.ms/oai/quotaincrease if you would like to further increase the default rate limit."
    }
}
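
Note that Azure returns the code field as a JSON string ("429"); if the decoding struct expects a different type for that field, unmarshalling fails and the caller never sees an *openai.APIError. A minimal, self-contained sketch with hypothetical struct names (not go-openai's actual types) showing the mismatch and one tolerant alternative:

package main

import (
	"encoding/json"
	"fmt"
)

const azureBody = `{"error":{"code":"429","message":"rate limited"}}`

// strictError expects a numeric code and fails on Azure's string code.
type strictError struct {
	Error struct {
		Code    int    `json:"code"`
		Message string `json:"message"`
	} `json:"error"`
}

// tolerantError leaves the code untyped so both 429 and "429" decode.
type tolerantError struct {
	Error struct {
		Code    any    `json:"code"`
		Message string `json:"message"`
	} `json:"error"`
}

func main() {
	var s strictError
	fmt.Println(json.Unmarshal([]byte(azureBody), &s)) // cannot unmarshal string into int

	var t tolerantError
	fmt.Println(json.Unmarshal([]byte(azureBody), &t)) // <nil>
	fmt.Printf("code=%v\n", t.Error.Code)              // code=429
}
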
@vvatanabe
Collaborator

@ZeroDeng01 Could you run the request with curl and attach the verbose output (-v or --verbose)?
Note: if you attach logs, please redact the secret.

@ZeroDeng01
Contributor Author

@vvatanabe Hello, the following is the log of my request. I have redacted the key, IP, domain name, etc. to avoid leaking the key and other sensitive information. I hope this helps you debug the issue. Thanks!

curl https://myhost.openai.azure.com/openai/deployments/gpt-35-turbo0301/chat/completions?api-version=2023-03-15-preview \
>   -H "Content-Type: application/json" \
>   -H "api-key: my azure key" \
-d '{">   -d '{"messages":[{"role":"system","content":"You are an AI assistant that helps people find information."},{"role":"user","content":"hello"}],
>   "max_tokens":800,
>   "temperature":0.7,
>   "frequency_penalty":0,
>   "presence_penalty":0,
>   "top_p":0.95,
>   "stop":null,
>   "stream":true
> }' -v
*   Trying 20.61.x.x:443...
* TCP_NODELAY set
* Connected to myhost.openai.azure.com (20.61.x.x) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
  CApath: /etc/ssl/certs
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
* ALPN, server accepted to use h2
* Server certificate:
*  subject: C=US; ST=WA; L=Redmond; O=Microsoft Corporation; CN=westeurope.api.cognitive.microsoft.com
*  start date: Mar 24 11:39:30 2023 GMT
*  expire date: Mar 18 11:39:30 2024 GMT
*  subjectAltName: host "myhost.openai.azure.com" matched cert's "*.openai.azure.com"
*  issuer: C=US; O=Microsoft Corporation; CN=Microsoft Azure TLS Issuing CA 01
*  SSL certificate verify ok.
* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
* Using Stream ID: 1 (easy handle 0x56102dd037c0)
> POST /openai/deployments/gpt-35-turbo0301/chat/completions?api-version=2023-03-15-preview HTTP/2
> Host: myhost.openai.azure.com
> user-agent: curl/7.68.0
> accept: */*
> content-type: application/json
> api-key: my azure key
> content-length: 277
>
* Connection state changed (MAX_CONCURRENT_STREAMS == 20)!
* We are completely uploaded and fine
< HTTP/2 429
< content-length: 368
< content-type: application/json
< retry-after: 17
< apim-request-id: 8282fe77-229a-4d8b-8ebb-8781dca6dde2
< strict-transport-security: max-age=31536000; includeSubDomains; preload
< x-content-type-options: nosniff
< policy-id: DeploymentRatelimit-Token
< x-ms-region: West Europe
< date: Wed, 07 Jun 2023 15:24:59 GMT
<
* Connection #0 to host myhost.openai.azure.com left intact
{"error":{"code":"429","message": "Requests to the Creates a completion for the chat message Operation under Azure OpenAI API version 2023-03-15-preview have exceeded token rate limit of your current OpenAI S0 pricing tier. Please retry after 17 seconds. Please go here: https://aka.ms/oai/quotaincrease if you would like to further increase the default rate limit."}}

@vvatanabe
Collaborator

@ZeroDeng01 I made a pull request to fix this problem.
#345
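
For readers landing here, one possible shape of such a fix is a code type that tolerates both numeric and string values (a sketch only, with a made-up flexCode type; see #345 for the actual change; needs strconv and strings):

// flexCode accepts both a JSON number (429) and a JSON string ("429"),
// normalizing to an int.
type flexCode int

func (c *flexCode) UnmarshalJSON(data []byte) error {
	s := strings.Trim(string(data), `"`) // drop quotes if the code came as a string
	n, err := strconv.Atoi(s)
	if err != nil {
		return err
	}
	*c = flexCode(n)
	return nil
}

A real fix would also need to keep non-numeric codes (for example OpenAI's string error codes) rather than rejecting them.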

@ZeroDeng01
Contributor Author

@vvatanabe You’ve been a big help. I just saw your pull request and I think it will solve my problem. Once it is approved and a new version is released, I will try it out and test it. Thank you again!😁

sashabaranov pushed a commit that referenced this issue Jun 8, 2023
* fix json marshaling error response of azure openai (#343)

* add a test case for handleErrorResp func (#343)