
Unexpected token 'A', "Agent not found" is not valid JSON #1059

Closed
cipherkilledit opened this issue Dec 14, 2024 · 1 comment
Labels
bug Something isn't working

Comments


cipherkilledit commented Dec 14, 2024

Describe the bug

When "llama_local" is set as the modelProvider in the character.json file, sending any prompt returns this error:

Error fetching response: SyntaxError: Unexpected token 'A', "Agent not found" is not valid JSON
    at JSON.parse (<anonymous>)
    at parseJSONFromBytes (node:internal/deps/undici/undici:5731:19)
    at successSteps (node:internal/deps/undici/undici:5712:27)
    at fullyReadBody (node:internal/deps/undici/undici:4609:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async consumeBody (node:internal/deps/undici/undici:5721:7)
    at async handleUserInput (file:///home/cipher/cipher/agent/src/index.ts:454:22)
    at async file:///home/cipher/cipher/agent/src/index.ts:420:13
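The stack trace indicates that the server replied with the plain-text body "Agent not found" (likely a non-200 error response) and the client parsed it as JSON unconditionally, so the real error message was swallowed by a `SyntaxError`. A minimal sketch of a guard around the parse, assuming a standard fetch `Response`; `safeJson` is a hypothetical helper for illustration, not part of the Eliza codebase:

```typescript
// Sketch: guard JSON parsing against plain-text error bodies such as
// "Agent not found". `safeJson` is a hypothetical helper, not Eliza API.
async function safeJson(res: Response): Promise<unknown> {
  // Read the body once as text so we can report it on failure.
  const text = await res.text();
  if (!res.ok) {
    // Surface the server's message instead of a misleading SyntaxError.
    throw new Error(`Request failed (${res.status}): ${text}`);
  }
  try {
    return JSON.parse(text);
  } catch {
    throw new Error(`Expected JSON but got: ${text}`);
  }
}
```

With a guard like this, the client would report something like "Request failed (404): Agent not found", which points at the real problem (the agent was not registered or initialized) rather than at JSON parsing.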

To Reproduce

  1. In any character.json file, set the modelProvider to "llama_local"
  2. Initialize your character
  3. Send a prompt, such as "hello"
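For reference, step 1 amounts to a character.json fragment like the following; every field other than modelProvider is an illustrative placeholder, not taken from the issue:

```json
{
  "name": "ExampleCharacter",
  "modelProvider": "llama_local",
  "clients": [],
  "settings": {}
}
```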

Expected behavior

I expected the character to respond with a greeting such as "hi", "hello", or "how are you?".

Screenshots

(screenshot of the error attached in the original issue)

Additional context

Using WSL with release: https://github.com/ai16z/eliza/releases/tag/v0.1.5-alpha.5

@cipherkilledit cipherkilledit added the bug Something isn't working label Dec 14, 2024
cipherkilledit (Author) commented:

@lalalune what was the resolution? Many users are still having issues using llama_local.
