implement groq completions #2
Conversation
First, thanks so much for this PR. The only thing I'm not a fan of right now is losing the "open" prefix matching for the models that have clear prefixes. For `from_model`, could we make it so that only Groq has the fixed-name match, and the others remain as before? If you do not have time to make the change, I will do it, no problem.

Having the fixed names for `list_models` is okay for now. That will change later, as I would like to have them queried when the API is available.

Note: by design, `list_models` is decoupled from `from_model`, as the first one is more for user info and tries to be as live as possible, while the second is for a wider match focused on in-memory/static processing.
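For illustration, a minimal sketch of that split, assuming illustrative adapter variants and Groq model names (the crate's actual `from_model` signature and covered providers may differ): prefix matching stays for providers with clear prefixes, and only Groq falls back to an exact-name match against a static list.

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum AdapterKind {
    OpenAI,
    Anthropic,
    Groq,
}

/// Illustrative Groq model names, made `pub` so `from_model` can check them.
pub const GROQ_MODELS: &[&str] = &["llama3-8b-8192", "llama3-70b-8192", "mixtral-8x7b-32768"];

impl AdapterKind {
    /// Prefix matching for providers with clear model-name prefixes;
    /// exact-name matching only for Groq, whose names share no common prefix.
    pub fn from_model(model: &str) -> Option<Self> {
        if model.starts_with("gpt") {
            Some(Self::OpenAI)
        } else if model.starts_with("claude") {
            Some(Self::Anthropic)
        } else if GROQ_MODELS.contains(&model) {
            Some(Self::Groq)
        } else {
            None
        }
    }
}
```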
I moved back to the prefix solution :)
ah makes sense
Perfect, I will merge this. So Groq supports the OpenAI API to the letter?
I think so, this is even their endpoint
I've tested Groq, and from what I remember it supports multiple system messages fine (for the Llama models I tested, at least), so hopefully no issue there.
Cool. I found this page: https://console.groq.com/docs/openai. So it seems the OpenAI API is their API strategy, and they should support it pretty well. I am going to add temperature, max_tokens, … all in the `ChatRequestOptions`, so we will get that soon, and that will include Groq.
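As a rough sketch of what those options could look like (the field names and types here are assumptions, not the crate's final API):

```rust
// Hypothetical shape of the planned chat request options; unset fields
// would fall back to the provider's defaults.
#[derive(Debug, Clone, Default)]
pub struct ChatRequestOptions {
    /// Sampling temperature forwarded to the provider.
    pub temperature: Option<f64>,
    /// Maximum number of tokens to generate.
    pub max_tokens: Option<u32>,
}
```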
@stargazing-dino Thanks, I just merged it. |
Btw, very clean code and PR. Thanks! |
Implements the Groq API. See their quickstart, which for the most part follows the OpenAI schema.

One relatively larger change this PR made was to `AdapterKind::from_model`. The models from Groq do not have any common prefix or suffix to do partial matching on, so I had to change it to match the exact string. For that I made the models `pub`. I thought about using `Adapter::list_models`, but it required `kind` to be passed through, which was unfeasible since we don't have one at that point; it's also async, so it would require more changes than just making the models pub to the crate.

I tested this locally and it seemed to work fine.