**Describe the bug**

When "llama_local" is set as the `modelProvider` in the character.json file, a prompt returns this error:

```
Error fetching response: SyntaxError: Unexpected token 'A', "Agent not found" is not valid JSON
    at JSON.parse (<anonymous>)
    at parseJSONFromBytes (node:internal/deps/undici/undici:5731:19)
    at successSteps (node:internal/deps/undici/undici:5712:27)
    at fullyReadBody (node:internal/deps/undici/undici:4609:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async consumeBody (node:internal/deps/undici/undici:5721:7)
    at async handleUserInput (file:///home/cipher/cipher/agent/src/index.ts:454:22)
    at async file:///home/cipher/cipher/agent/src/index.ts:420:13
```
**To Reproduce**

1. In any character.json file, set the `modelProvider` to "llama_local"
2. Initialize your character
3. Send a prompt, such as "hello"
**Expected behavior**

I expected the character to respond with a greeting such as "hi", "hello", or "how are you".
**Additional context**
Using WSL with release: https://github.com/ai16z/eliza/releases/tag/v0.1.5-alpha.5