feat: support ollama.com structured-outputs #1305

Merged: 6 commits into master from ollama-structured-output on Dec 12, 2024

Conversation

willshang76 (Collaborator)

Description

Ollama now supports structured outputs (https://ollama.com/blog/structured-outputs). This PR adds structured-output support to CAMEL's OllamaModel backend; the mechanism is sketched below.
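A minimal sketch of the mechanism this PR relies on: Ollama's OpenAI-compatible endpoint can accept a Pydantic model through the OpenAI SDK's beta parse helper, the same `beta.chat.completions.parse` call visible in the traceback later in this thread. The `Pet` schema, prompt, and `api_key` value here are illustrative assumptions; only the endpoint, model name, and parse call come from this PR's logs.

```python
# Sketch only: assumes a local Ollama server (>= 0.5) with llama3.2 pulled.
# The Pet schema and prompt are made up for illustration.
from openai import OpenAI
from pydantic import BaseModel


class Pet(BaseModel):
    name: str
    animal: str


client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # the local server ignores the key, but the SDK requires one
)

completion = client.beta.chat.completions.parse(
    model="llama3.2",
    messages=[{"role": "user", "content": "I have a cat named Luna."}],
    response_format=Pet,  # constrains the model's output to Pet's JSON schema
)
print(completion.choices[0].message.parsed)  # e.g. Pet(name='Luna', animal='cat')
```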

Motivation and Context

Closes #1290.

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Put an x in all the boxes that apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of example)


Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide. (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly. (required for a bug fix or a new feature)
  • I have updated the documentation accordingly.

@willshang76 willshang76 self-assigned this Dec 11, 2024
@willshang76 willshang76 added the Model Related to backend models label Dec 11, 2024
@willshang76 willshang76 added this to the Sprint 18 milestone Dec 11, 2024
@MuggleJinx MuggleJinx (Collaborator) left a comment

Overall looks good!

camel/models/base_model.py: two review threads, outdated and resolved
@MuggleJinx (Collaborator)

However, I failed to run the example: the model did not produce valid JSON, and the Pydantic validation failed. Other reviewers can double-check this.

@Wendong-Fan Wendong-Fan (Member) left a comment
Thanks @willshang76! It seems the example code currently can't run as expected:

Ollama server started on http://localhost:11434/v1 for llama3.2 model.
2024-12-11 21:53:27,666 - httpx - INFO - HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1 200 OK"
2024-12-11 21:53:27,674 - camel.models.model_manager - ERROR - Error processing with model: <camel.models.ollama_model.OllamaModel object at 0x10628fb60>
2024-12-11 21:53:27,674 - camel.agents.chat_agent - ERROR - An error occurred while running model llama3.2, index: 0
Traceback (most recent call last):
  File "/Users/enrei/Desktop/camel_1127/camel/camel/agents/chat_agent.py", line 965, in _step_model_response
    response = self.model_backend.run(openai_messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Desktop/camel_1127/camel/camel/models/model_manager.py", line 211, in run
    raise exc
  File "/Users/enrei/Desktop/camel_1127/camel/camel/models/model_manager.py", line 201, in run
    response = self.current_model.run(messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Desktop/camel_1127/camel/camel/models/ollama_model.py", line 142, in run
    response = self._client.beta.chat.completions.parse(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/resources/beta/chat/completions.py", line 156, in parse
    return self._post(
           ^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1280, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 957, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1063, in _request
    return self._process_response(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1162, in _process_response
    return api_response.parse()
           ^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_response.py", line 319, in parse
    parsed = self._options.post_parser(parsed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/resources/beta/chat/completions.py", line 150, in parser
    return _parse_chat_completion(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/lib/_parsing/_completions.py", line 110, in parse_chat_completion
    "parsed": maybe_parse_content(
              ^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/lib/_parsing/_completions.py", line 161, in maybe_parse_content
    return _parse_content(response_format, message.content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/lib/_parsing/_completions.py", line 221, in _parse_content
    return cast(ResponseFormatT, model_parse_json(response_format, content))
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/openai/_compat.py", line 169, in model_parse_json
    return model.model_validate_json(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Library/Caches/pypoetry/virtualenvs/camel-ai-gf_ewcJG-py3.12/lib/python3.12/site-packages/pydantic/main.py", line 625, in model_validate_json
    return cls.__pydantic_validator__.validate_json(json_data, strict=strict, context=context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for PetList
  Invalid JSON: expected ident at line 1 column 2 [type=json_invalid, input_value="It sounds like you have ...ties in your household?", input_type=str]
    For further information visit https://errors.pydantic.dev/2.9/v/json_invalid
Traceback (most recent call last):
  File "/Users/enrei/Desktop/camel_1127/camel/examples/models/ollama_model_example.py", line 83, in <module>
    assistant_response = agent.step(user_msg)
                         ^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Desktop/camel_1127/camel/camel/agents/chat_agent.py", line 540, in step
    ) = self._step_model_response(openai_messages, num_tokens)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enrei/Desktop/camel_1127/camel/camel/agents/chat_agent.py", line 976, in _step_model_response
    raise ModelProcessingError(
camel.models.model_manager.ModelProcessingError: Unable to process messages: none of the provided models run succesfully.
(camel-ai-py3.12) 
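The root cause is visible in the ValidationError above: the model answered in conversational prose ("It sounds like you have ...") instead of JSON, so strict JSON validation rejected the message content. A minimal reproduction of just that validation step follows; the PetList fields are assumed for illustration, since only the class name appears in the traceback.

```python
from pydantic import BaseModel


class Pet(BaseModel):  # fields assumed for illustration
    name: str
    animal: str


class PetList(BaseModel):
    pets: list[Pet]


try:
    # Prose instead of JSON fails strict validation, as in the log above.
    PetList.model_validate_json("It sounds like you have a lovely pet.")
except Exception as e:
    print(e)  # 1 validation error for PetList: Invalid JSON ...
```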

@willshang76 (Collaborator, Author) commented Dec 11, 2024

Based on the Ollama docs (https://ollama.com/blog/structured-outputs), you need to install the latest version of Ollama to make it work.
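Structured outputs shipped with the Ollama 0.5 release (per the announcement linked above), so an older local server falls back to free-form text, which matches the failure in the log. One way to confirm the running server's version is Ollama's /api/version REST endpoint; the 0.5.0 threshold in the comment below is an assumption based on that announcement.

```python
# Check the local Ollama server version; structured outputs need >= 0.5.0.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/version") as resp:
    version = json.load(resp)["version"]
print(version)
```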

@Wendong-Fan (Member)

Thanks @willshang76!

@Wendong-Fan Wendong-Fan merged commit ed92297 into master Dec 12, 2024
6 checks passed
@Wendong-Fan Wendong-Fan deleted the ollama-structured-output branch December 12, 2024 08:47