
Value error, Please set your API key in config2.yaml [type=value_error, input_value='YOUR_API_KEY', input_type=str] #1671

Open · gokulcoder7 opened this issue Jan 17, 2025 · 2 comments

Comments

@gokulcoder7

Bug description

Bug solved method

Environment information

  • LLM type and model name: Ollama, qwen2.5-coder:14b
  • System version: Windows 11
  • Python version: Python 3.9.21
  • MetaGPT version or branch:
  • packages version:
  • installation method:

llm:
  api_type: "ollama"  # or azure / ollama / open_llm etc. Check LLMType for more options
  model: "qwen2.5-coder:14b"  # or gpt-3.5-turbo-1106 / gpt-4-1106-preview
  base_url: "http://localhost:11434"  # or forward url / other llm url
  api_key: "YOUR_API_KEY"

Screenshots or logs

[Screenshot of the error log attached]

@gokulcoder7 (Author)

How do I find the API key for ollama models?

@iorisa (Collaborator) commented Jan 18, 2025

For configuring config2.yaml for ollama, you can refer to this document: ollama-api
If it is a local ollama model, you can fill in any string for api_key, but it cannot be an empty string, left unfilled, or the placeholder YOUR_API_KEY.
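
For example, a minimal config2.yaml sketch for a local ollama model along those lines might look like this (the api_key value "ollama" is an arbitrary non-empty string, not a real key; any value other than an empty string or the YOUR_API_KEY placeholder should pass the validation check):

llm:
  api_type: "ollama"
  model: "qwen2.5-coder:14b"          # a model pulled locally, e.g. via `ollama pull qwen2.5-coder:14b`
  base_url: "http://localhost:11434"  # default local ollama endpoint
  api_key: "ollama"                   # arbitrary non-empty placeholder; must not be "" or "YOUR_API_KEY"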
