Open
Labels
Good first issue (good for newcomers)
Description
Describe the bug
After running `interpreter --model i` without any prior configuration, the following error is raised once some input is sent:

> wheres my chrime_analysis directory

16:16:08 - LiteLLM:ERROR: utils.py:1953 - Model not found or error in checking vision support. You passed model=i, custom_llm_provider=openai. Error: This model isn't mapped yet. model=i, custom_llm_provider=openai. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json
Reproduce
1. pip install open-interpreter
2. Run `interpreter --model i`
3. Type any question, e.g. ask where a random directory is
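The failure appears to come from LiteLLM's model-capability lookup: the model name `i` is not a key in its `model_prices_and_context_window.json` map, so the vision-support check errors out instead of falling back. A minimal stdlib-only sketch of that lookup pattern (the `model_map` excerpt and `check_vision_support` helper are hypothetical illustrations, not LiteLLM's actual API):

```python
import json

# Hypothetical excerpt mirroring model_prices_and_context_window.json:
# known model names map to capability entries; "i" is absent.
model_map = json.loads('{"gpt-4o": {"supports_vision": true}}')

def check_vision_support(model: str) -> bool:
    entry = model_map.get(model)
    if entry is None:
        # Mirrors the reported LiteLLM error: the model isn't mapped yet.
        raise KeyError(f"This model isn't mapped yet. model={model}")
    return entry.get("supports_vision", False)

print(check_vision_support("gpt-4o"))  # True
# check_vision_support("i") raises KeyError, analogous to the bug above
```

Under this reading, the fix would be to treat an unmapped model name as "capability unknown" rather than an error when no explicit provider configuration is present.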
Expected behavior
It should work successfully, without errors and without requiring any configuration (running against the hosted LLM).
Screenshots
Open Interpreter version
0.4.3
Python version
3.11.9
Operating System name and version
Windows 11
Additional context
No response