Move default model to gpt-4o-mini #91
Conversation
@PeterDaveHello Thank you 😄
@@ -83,7 +83,7 @@ aicommit config set <key>=<value>
 | `OPENAI_KEY` | N/A | The OpenAI API key. |
 | `locale` | `en` | Locale for the generated commit messages. |
 | `generate` | `1` | Number of commit messages to generate. |
-| `model` | `gpt-3.5-turbo` | The Chat Completions model to use. |
+| `model` | `gpt-4o-mini` | The Chat Completions model to use. |
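(For reference: with the `aicommit config set <key>=<value>` syntax shown in the hunk header above, the previous default can still be selected explicitly, e.g. `aicommit config set model=gpt-3.5-turbo`.)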
@PeterDaveHello I think this model should be `gpt-4o`, according to the currently supported models.
I don't understand. It's quite clear:

> As of July 2024, gpt-4o-mini should be used in place of gpt-3.5-turbo, as it is cheaper, more capable, multimodal, and just as fast. gpt-3.5-turbo is still available for use in the API.
I completely agree with you, and I wish I could use the `-mini`, but as far as I can see it's only the `-4o` that is supported right now in their dictionary? Let me know if I'm misunderstanding something :)
Is there any dependency on the repo you mentioned? I don't know why it's relevant.
Yes, it's dependent on OpenAI's tiktoken repo here: https://github.com/openai/tiktoken/blob/c0ba74c238d18b4824c25f3c27fc8698055b9a76/tiktoken/model.py#L22
Maybe this is what you mean:

Line 32 in cad5e79:

`"@dqbd/tiktoken": "^1.0.2"`
Then we should wait for dqbd/tiktoken#112. Otherwise, I don't understand why you asked me to open a PR here.
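For context, the failure mode here is that `@dqbd/tiktoken` resolves encodings from an internal model dictionary, so a model name it doesn't know (like `gpt-4o-mini` before dqbd/tiktoken#112 lands) cannot be looked up. A minimal sketch of the kind of interim workaround that would be possible, assuming the package's `encoding_for_model` / `get_encoding` / `free` API and that `cl100k_base` is an acceptable approximation; the helper name is hypothetical, not part of aicommit:

```ts
import { encoding_for_model, get_encoding, type TiktokenModel } from "@dqbd/tiktoken";

// Hypothetical helper (not aicommit code): count tokens for a prompt,
// falling back to a generic encoding when the model name is missing from
// tiktoken's model dictionary (e.g. "gpt-4o-mini" before dqbd/tiktoken#112).
export function countTokens(text: string, model: string): number {
  let encoder;
  try {
    encoder = encoding_for_model(model as TiktokenModel);
  } catch {
    // Unknown model: approximate with cl100k_base. The gpt-4o family actually
    // uses o200k_base, so this is only a rough estimate.
    encoder = get_encoding("cl100k_base");
  }
  try {
    return encoder.encode(text).length;
  } finally {
    encoder.free(); // the WASM-backed encoder must be freed explicitly
  }
}
```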
You are absolutely right 👍🏼
Maybe using the official `openai-node` lib will prevent similar issues in the future, because they already support `gpt-4o-mini`?
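For reference, a minimal sketch of what that could look like with the official `openai` npm package (an illustration only, not aicommit's current code; the prompt wording and helper name are made up):

```ts
import OpenAI from "openai";

// Sketch only: ask the Chat Completions API for a commit message using the
// official openai-node client, which already accepts "gpt-4o-mini".
const openai = new OpenAI({ apiKey: process.env.OPENAI_KEY });

async function generateCommitMessage(diff: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "Write a concise git commit message for the following diff." },
      { role: "user", content: diff },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```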
@PeterDaveHello Good point. I will remove the dependency as it gives no value 😄
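If a rough length guard is still wanted once the tokenizer dependency is gone, one possible stand-in is a character-based estimate. This is purely a sketch under the assumption that ~4 characters per token is close enough for a guardrail; the function name and default limit are hypothetical:

```ts
// Hypothetical replacement for a tiktoken-based check: estimate the token
// count from the character count and reject prompts that would clearly
// exceed the model's context window.
const APPROX_CHARS_PER_TOKEN = 4;

export function assertWithinContextWindow(prompt: string, maxTokens = 128_000): void {
  const estimatedTokens = Math.ceil(prompt.length / APPROX_CHARS_PER_TOKEN);
  if (estimatedTokens > maxTokens) {
    throw new Error(`Prompt is roughly ${estimatedTokens} tokens, over the ${maxTokens}-token limit.`);
  }
}
```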
If you update your branch now, I think it should work :)
@negoziator as you requested 😏