
llm chat errors on followup since 0.6 #601

Closed
yorickvP opened this issue Oct 31, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@yorickvP

$ llm chat
> hello
how are you?
> not much, how are you?
Traceback (most recent call last):
  File "/nix/store/bhbq5lpdd2ncpn7b12w8d5sl6x9xm4gy-python3.12-llm-0.17/bin/.llm-wrapped", line 9, in <module>
    sys.exit(cli())
             ^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm/cli.py", line 535, in chat
    for chunk in response:
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm/models.py", line 169, in __iter__
    for chunk in self.model.execute(
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm_claude_3.py", line 159, in execute
    "messages": self.build_messages(prompt, conversation),
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm_claude_3.py", line 110, in build_messages
    if response.attachments:
       ^^^^^^^^^^^^^^^^^^^^
AttributeError: 'Response' object has no attribute 'attachments'
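The failure can be reproduced in isolation: the names `Response`, `build_messages`, and `attachments` are taken from the traceback, but the classes below are plain stand-ins for LLM's internals, and the `getattr()` read is one defensive pattern for illustration, not necessarily the fix that landed.

```python
class Response:
    """Stand-in for llm.models.Response as seen by chat follow-ups;
    note that .attachments is never assigned."""
    def __init__(self, text):
        self.text = text

def build_messages(previous_responses):
    """Sketch of the plugin code path that crashed. The traceback's
    bare access (`if response.attachments:`) raises AttributeError;
    getattr() with a default tolerates responses created without
    attachment support."""
    messages = []
    for response in previous_responses:
        if getattr(response, "attachments", None):
            messages.append({"role": "user", "content": "<attachments>"})
        messages.append({"role": "assistant", "content": response.text})
    return messages

# A follow-up turn replays the first response; with the bare
# attribute access, this is where the AttributeError surfaced.
print(build_messages([Response("how are you?")]))
# -> [{'role': 'assistant', 'content': 'how are you?'}]
```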
simonw (Owner) commented Nov 1, 2024

Thanks - this is a bug in LLM core, moving it there.

simonw transferred this issue from simonw/llm-claude-3 on Nov 1, 2024
simonw added the bug (Something isn't working) label on Nov 1, 2024
simonw (Owner) commented Nov 1, 2024

This happens with both Claude 3.5 Sonnet and GPT-4o, so I think it's a bug in LLM itself rather than in any one plugin.

simonw (Owner) commented Nov 1, 2024

I'm not sure why this test didn't catch this:

llm/tests/test_chat.py

Lines 20 to 31 in 122265a

@pytest.mark.xfail(sys.platform == "win32", reason="Expected to fail on Windows")
def test_chat_basic(mock_model, logs_db):
    runner = CliRunner()
    mock_model.enqueue(["one world"])
    mock_model.enqueue(["one again"])
    result = runner.invoke(
        llm.cli.cli,
        ["chat", "-m", "mock"],
        input="Hi\nHi two\nquit\n",
        catch_exceptions=False,
    )
    assert result.exit_code == 0

I've manually tested the fix and confirmed that it works. I'll need to rethink this when I add attachment support to llm chat directly.
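One plausible reason the test above passed is that the mock model's `execute()` never reads `.attachments` off earlier responses, so the code path that crashes with real plugins is never exercised. A self-contained illustration, using plain classes and hypothetical function names standing in for the mock fixture:

```python
class Response:
    """Stand-in for a prior-turn response object; like the buggy
    code path, it is built without an .attachments attribute."""
    def __init__(self, text):
        self.text = text

# A mock whose execute() ignores conversation history (as the
# test's mock model apparently does) can never hit the bug:
def indifferent_execute(prompt, history):
    return "one again"

assert indifferent_execute("Hi two", [Response("one world")]) == "one again"

# A mock that reads .attachments the way llm-claude-3 does would
# have surfaced the AttributeError on the second prompt:
def attachment_reading_execute(prompt, history):
    for response in history:
        if response.attachments:  # bare access, as in the traceback
            pass
    return "one again"

try:
    attachment_reading_execute("Hi two", [Response("one world")])
    raised = False
except AttributeError:
    raised = True
assert raised  # exactly the crash reported on the follow-up turn
```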

simonw (Owner) commented Nov 1, 2024

Manual testing:

% llm chat -m gpt-4o-mini
Chatting with gpt-4o-mini
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> A short pun about pelicans
Why don’t pelicans ever get invited to parties? Because they always bring their own bag!
> banana slugs
What do banana slugs wear to impress each other? Slug-phy attire! 🍌🐌

And llm logs -c now returns:

2024-11-01T21:18:36 conversation: 01jbmsg3f181dntajvanx0ep5g

Model: gpt-4o-mini

Prompt:

A short pun about pelicans

Response:

Why don’t pelicans ever get invited to parties? Because they always bring their own bag!

2024-11-01T21:18:41

Prompt:

banana slugs

Response:

What do banana slugs wear to impress each other? Slug-phy attire! 🍌🐌

simonw added a commit that referenced this issue Nov 1, 2024
maxwelljoslyn commented Nov 1, 2024

This seems to be a duplicate of #597, so I'll close that too -- glad to see it fixed, thanks Simon!
