
[Feature]: support groq #346

Open
hacker-hackman opened this issue May 22, 2024 · 4 comments
Comments

@hacker-hackman

Description

Add support for API at groq.com.

Suggested Solution

The Groq API is compatible with the OpenAI API, so essentially no additional functionality is required.

Alternatives

I got Groq working with opencommit just by removing the corresponding checks for the API key name and the model name in cli.cjs:

["OCO_OPENAI_API_KEY" /* OCO_OPENAI_API_KEY */](value, config9 = {}) {
  validateConfig(
    "OpenAI API_KEY",
    value || config9.OCO_ANTHROPIC_API_KEY || config9.OCO_AI_PROVIDER == "ollama" || config9.OCO_AI_PROVIDER == "test",
    "You need to provide an OpenAI/Anthropic API key"
  );
  // Removed the check for the key starting with "sk-"
  return value;
},
["OCO_MODEL" /* OCO_MODEL */](value) {
  // Removed the validation that checks if the model name is in the predefined list
  return value;
},
var MODEL = config3?.OCO_MODEL || "gpt-3.5-turbo";
// Removed the validation that checks if the model name is in the predefined list

my .opencommit config:

OCO_OPENAI_API_KEY=MY_SECRET_GROQ_KEY
OCO_ANTHROPIC_API_KEY=undefined
OCO_TOKENS_MAX_INPUT=undefined
OCO_TOKENS_MAX_OUTPUT=undefined
OCO_OPENAI_BASE_PATH=https://api.groq.com/openai/v1
OCO_DESCRIPTION=false
OCO_EMOJI=false
OCO_MODEL=llama3-70b-8192
OCO_LANGUAGE=en
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg
OCO_PROMPT_MODULE=conventional-commit
OCO_AI_PROVIDER=openai
OCO_GITPUSH=false
OCO_ONE_LINE_COMMIT=false

Additional Context

No response

@di-sukharev
Owner

We support OpenAI, Anthropic, and other providers like local Ollama. You can make Groq work without hacks by adding a new provider, similar to what we have for Anthropic support.

@Nidesh-Arumugam

@di-sukharev can you elaborate on this? I am kind of stuck in the same situation and am looking for a way around it.

@di-sukharev
Owner

I mean that instead of removing that piece of config, you should add a check for the Groq config and add the model name to the predefined list.

Those config validators help you configure correct values; we should not remove them.

@ozeliurs
Contributor

ozeliurs commented Sep 7, 2024

Hey @hacker-hackman @Nidesh-Arumugam, Groq is available in the latest release!

Probably should close this @di-sukharev.
