
Add GPT-4-Turbo/Vision + Updated GPT-3.5-Turbo models #406

Merged 10 commits into betalgo:dev on Nov 10, 2023

Conversation

ChaseIngersol (Contributor)

No description provided.

@ChaseIngersol ChaseIngersol changed the base branch from master to dev November 6, 2023 21:42
@tylerje commented Nov 6, 2023

In addition to the models, I'd love to see the "response_format" and "seed" properties added to the ChatCompletionCreateRequest class in order to use these new features. That may require some conditional logic to allow them only for the models that support them, if that turns out to be the case.
https://platform.openai.com/docs/api-reference/chat/create#chat-create-response_format

@ChaseIngersol (Contributor, Author)

I've added the response_format and seed properties to ChatCompletionCreateRequest but will need a bit of time to look into which models support these, as I can't find a clear answer yet in the docs. May require some testing on my part.

@tylerje commented Nov 7, 2023

> I've added the response_format and seed properties to ChatCompletionCreateRequest but will need a bit of time to look into which models support these, as I can't find a clear answer yet in the docs. May require some testing on my part.

Cool. I'll do some digging to see if I can help.

@ChaseIngersol (Contributor, Author) commented Nov 7, 2023

Looks like response_format is only compatible with Gpt_4_1106_preview and Gpt_3_5_Turbo_1106 from my testing. seed however is compatible with all GPT models.

Incompatible models return the message "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model." as the response.

Is conditional logic still necessary in this case? If so, would the preferred approach be to throw an exception informing the user that the model is incompatible with this property?

I've also added a ResponseFormat class for convenience so that the usage would be like so:

var completionTask = openAiService.ChatCompletion.CreateCompletion(new ChatCompletionCreateRequest {
	Messages = new List<ChatMessage> {
		ChatMessage.FromSystem(systemMessage),
		ChatMessage.FromUser(prompt)
	},
	Model = Models.Gpt_4_1106_preview,
	Temperature = 0.8f,
	ResponseFormat = new ResponseFormat { Type = "json_object" }
}, null, cancellationToken);
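For reference, one shape the conditional check discussed above could take, as a sketch only: the ResponseFormatGuard class and its Validate method are hypothetical names rather than anything in the library, and the model-name strings mirror the two models found to work in testing.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical client-side guard run before sending the request, so the
// caller gets an exception instead of the API's "Invalid parameter" error.
public static class ResponseFormatGuard
{
    // The two models observed to accept response_format in testing.
    private static readonly HashSet<string> JsonCapableModels = new()
    {
        "gpt-4-1106-preview",  // Models.Gpt_4_1106_preview
        "gpt-3.5-turbo-1106"   // Models.Gpt_3_5_Turbo_1106
    };

    public static void Validate(string model, string? responseFormatType)
    {
        if (responseFormatType == "json_object" && !JsonCapableModels.Contains(model))
        {
            throw new NotSupportedException(
                $"response_format 'json_object' is not supported by model '{model}'.");
        }
    }
}
```

The trade-off is that a hard-coded list goes stale as new models gain support, which is an argument for letting the API report the error instead.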

@zxrohex commented Nov 7, 2023

> Is conditional logic still necessary in this case? If so, would the preferred approach be to throw an exception informing the user that the model is incompatible with this property?

The FunctionCall property in ChatMessage is always defined too, and it throws an exception when it's null, which is nearly all of the time (GPT calling a function is the property's sole reason for existing). So I don't see why this case should be handled differently; the library mostly uses always-defined properties that throw when they aren't applicable, unless the mere existence of the property causes difficulties.

@tylerje commented Nov 8, 2023

> ResponseFormat { Type = "json_object" }

Why not make this an enum with two values:

- Unspecified (default)
- JsonObject

My reasoning: there are likely to be additional format types in the future, and an enum avoids the proliferation of magic strings.
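A sketch of what that enum might look like, assuming System.Text.Json serialization; ResponseFormatType and the From factory are hypothetical names, with Unspecified mapping to omitting the property from the request entirely.

```csharp
using System.Text.Json.Serialization;

// Hypothetical enum replacing the "json_object" magic string.
public enum ResponseFormatType
{
    Unspecified, // default: the response_format property is omitted from the request
    JsonObject   // serialized on the wire as "json_object"
}

public class ResponseFormat
{
    [JsonPropertyName("type")]
    public string? Type { get; set; }

    // Maps the enum to the wire format in one place; Unspecified yields null,
    // so the property can be left out of the serialized request.
    public static ResponseFormat? From(ResponseFormatType type) =>
        type == ResponseFormatType.JsonObject
            ? new ResponseFormat { Type = "json_object" }
            : null;
}
```

A future format type would then only need a new enum member and a new mapping case, with no string literals at call sites.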

@kayhantolga (Member)

I made some changes; if everyone is happy, I will merge this tomorrow.

@kayhantolga added the "Code Review Needed" label ("Please code review if you have time") Nov 9, 2023
@kayhantolga kayhantolga added this to the 7.4.0 milestone Nov 9, 2023
@kayhantolga (Member)

I messed up a bit with resolving merge conflicts (actually, I blame Visual Studio). Please let me know if you see something odd.

@kayhantolga kayhantolga merged commit ea72ed8 into betalgo:dev Nov 10, 2023
5 participants