[BUG] - Generating recipe from image failing using Ollama, but debug succeeding #4404

ChilliGeologist opened this issue Oct 21, 2024 · 6 comments

First Check

  • This is not a feature request.
  • I added a very descriptive title to this issue (title field is above this).
  • I used the GitHub search to find a similar issue and didn't find it.
  • I searched the Mealie documentation, with the integrated search.
  • I already read the docs and didn't find an answer.
  • This issue can be replicated on the demo site (https://demo.mealie.io/).

What is the issue you are experiencing?

When attempting to generate a recipe from an image, I receive an Internal Server Error (500). I am using Ollama as my OpenAI backend; however, the OpenAI debug tests succeed. Here is the image I am testing with:

https://marketplace.canva.com/EADaoY_qT7Y/1/0/1067w/canva-brown-cream-cookies-general-recipe-card-6fHd-Q-t6oE.jpg

Steps to Reproduce

  1. Go to Settings -> Debug -> OpenAI
  2. Upload sample image and hit "Run Test"
  3. Note valid response and no errors
  4. Go to Create Recipe -> Create from image
  5. Upload same image, and click Create
  6. Creation fails

Please provide relevant logs

When using the generate-recipe-from-image feature, I see the following errors in my log:

INFO 2024-10-21T14:11:37 - HTTP Request: POST http://ollama:11434/v1/chat/completions "HTTP/1.1 500 Internal Server Error"
INFO 2024-10-21T14:11:37 - Retrying request to /chat/completions in 0.452659 seconds
INFO 2024-10-21T14:11:47 - HTTP Request: POST http://ollama:11434/v1/chat/completions "HTTP/1.1 500 Internal Server Error"
INFO 2024-10-21T14:11:48 - Retrying request to /chat/completions in 0.897258 seconds
INFO 2024-10-21T14:11:58 - HTTP Request: POST http://ollama:11434/v1/chat/completions "HTTP/1.1 500 Internal Server Error"
ERROR 2024-10-21T14:11:58 - Exception in ASGI application
Traceback (most recent call last):
  File "/app/mealie/services/openai/openai.py", line 177, in get_response
    response = await self._get_raw_response(prompt, user_messages, temperature, force_json_response)
  File "/app/mealie/services/openai/openai.py", line 142, in _get_raw_response
    return await client.chat.completions.create(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1633, in create
    return await self._post(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1838, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1532, in request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1618, in _request
    return await self._retry_request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1665, in _retry_request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1618, in _request
    return await self._retry_request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1665, in _retry_request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1633, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'an unknown error was encountered while running the model ', 'type': 'api_error', 'param': None, 'code': None}}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/mealie/services/recipe/recipe_service.py", line 491, in build_recipe_from_images
    response = await openai_service.get_response(
  File "/app/mealie/services/openai/openai.py", line 182, in get_response
    raise Exception(f"OpenAI Request Failed. {e.__class__.__name__}: {e}") from e
Exception: OpenAI Request Failed. InternalServerError: Error code: 500 - {'error': {'message': 'an unknown error was encountered while running the model ', 'type': 'api_error', 'param': None, 'code': None}}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/sessions.py", line 85, in __call__
    await self.app(scope, receive, send_wrapper)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 24, in __call__
    await responder(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 44, in __call__
    await self.app(scope, receive, self.send_with_gzip)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/app/mealie/routes/_base/routers.py", line 35, in custom_route_handler
    response = await original_route_handler(request)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
    return await dependant.call(**values)
  File "/app/mealie/routes/recipe/recipe_crud_routes.py", line 322, in create_recipe_from_image
    recipe = await self.service.create_from_images(images, translate_language)
  File "/app/mealie/services/recipe/recipe_service.py", line 297, in create_from_images
    recipe_data = await openai_recipe_service.build_recipe_from_images(
  File "/app/mealie/services/recipe/recipe_service.py", line 495, in build_recipe_from_images
    raise Exception("Failed to call OpenAI services") from e
Exception: Failed to call OpenAI services

(The same ERROR entry and full traceback are then logged a second time.)

However, when running the OpenAI debug tool with the exact same test image, I receive no errors in my log:

INFO 2024-10-21T14:05:31 - HTTP Request: POST http://ollama:11434/v1/chat/completions "HTTP/1.1 200 OK"

Mealie Version

Mealie Build: b86c01e

Deployment

Unraid

Additional Deployment Details

No response

michael-genson (Collaborator) commented

an unknown error was encountered while running the model

Users who have tried Ollama have reported, after checking their logs, that their model can't support a large enough context window for Mealie. When we send the image, we also send a prompt to instruct the LLM on how to parse it. Can you check your Ollama logs and see what you find?
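If it helps to isolate this outside of Mealie, a request along these lines should exercise the same Ollama code path (a rough sketch; the model name, prompt, and image path are placeholders, not Mealie's actual values):

# Rough standalone repro of the kind of request Mealie sends: an
# instruction prompt plus the image, via Ollama's OpenAI-compatible API.
import base64

from openai import OpenAI

# Ollama ignores the API key, but the client requires one
client = OpenAI(base_url="http://ollama:11434/v1", api_key="ollama")

with open("recipe-card.jpg", "rb") as f:  # placeholder image path
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="llava",  # placeholder: whichever vision model you have pulled
    messages=[
        {"role": "system", "content": "Extract the recipe shown in the image."},
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        },
    ],
)
print(response.choices[0].message.content)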

ChilliGeologist (Author) commented

Looking at Ollama I see the following:

time=2024-10-21T05:20:59.890Z level=WARN source=sched.go:137 msg="multimodal models don't support parallel requests yet"
time=2024-10-21T05:20:59.892Z level=WARN source=server.go:518 msg="llama runner process no longer running" sys=11 string="signal: segmentation fault"
time=2024-10-21T05:21:05.000Z level=WARN source=sched.go:646 msg="gpu VRAM usage didn't recover within timeout" seconds=5.108732829 model=/root/.ollama/models/blobs/sha256-170370233dd5c5415250a2ecd5c71586352850729062ccef1496385647293868

So I guess it's just a VRAM issue? I'm incredibly new to this stuff, so I don't fully understand what is meant by a context window.

michael-genson (Collaborator) commented

All good, I think it's new for most people!

The context window has to do with the model; you can see more in the docs: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size

However yeah, if you don't have enough VRAM, the model will be slow and/or fail entirely. The amount of VRAM required depends on the model you're using.
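For example, per that FAQ you can test a larger context window against Ollama's native API (a sketch; the model name and num_ctx value are placeholders for your setup):

# Sketch based on the Ollama FAQ linked above: the native /api/generate
# endpoint accepts a per-request num_ctx override.
import requests

resp = requests.post(
    "http://ollama:11434/api/generate",
    json={
        "model": "llava",  # placeholder model name
        "prompt": "Why is the sky blue?",
        "options": {"num_ctx": 8192},  # placeholder context window size
        "stream": False,
    },
    timeout=300,
)
print(resp.json()["response"])

As far as I know, Ollama's OpenAI-compatible endpoint (the one Mealie calls) doesn't accept those per-request options, so the FAQ's other approach, creating a derived model from a Modelfile with PARAMETER num_ctx and pointing Mealie at that model, may be the more practical route.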

ChilliGeologist (Author) commented Oct 21, 2024

Thanks for that information! I'll play around with other models and see if I can get it going.

Out of curiosity, is there anything Mealie is doing that requires more VRAM? I ask because I'm using this same model and Ollama instance to analyse pictures with other tools without issue, and the debug test in Mealie works fine and provides an accurate description of the test image. Knowing this might help me in my search for a model that works for me.

michael-genson (Collaborator) commented

VRAM, no, but the prompt is pretty big (we include a large prompt to help improve accuracy), which is where the context window comes into play.

We're effectively using the model to take unstructured data (the image) and transform it into structured data that Mealie can understand.
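As an illustration (a sketch, not Mealie's actual code), the request asks an OpenAI-compatible backend for JSON output, roughly like this:

# Illustrative only: forcing structured JSON output instead of free-form
# text. Model name and prompt text are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://ollama:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llava",  # placeholder
    response_format={"type": "json_object"},  # ask for JSON, not prose
    messages=[
        {
            "role": "system",
            "content": "Return the recipe as JSON with keys name, ingredients, instructions.",
        },
        {
            "role": "user",
            "content": "Chocolate chip cookies: cream butter and sugar, add flour, bake.",
        },
    ],
)
print(response.choices[0].message.content)

Forcing JSON output is what lets the response be deserialized straight into a recipe, but it also means the model has to hold the long instructions and the image in its context at the same time.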

michael-genson (Collaborator) commented

This is the prompt: https://github.com/mealie-recipes/mealie/blob/mealie-next/mealie/services/openai/prompts/recipes/parse-recipe-image.txt

And we also inject the recipe schema as JSON into the prompt.
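Roughly, the final prompt is assembled like this (a sketch; the schema fields below are made up, not Mealie's real recipe schema, and the file path is a placeholder):

# Made-up sketch of injecting a target schema into the prompt so the model
# knows the exact shape of JSON to return.
import json

recipe_schema = {
    "name": "string",
    "description": "string",
    "recipe_yield": "string",
    "ingredients": [{"title": "string", "text": "string"}],
    "instructions": [{"title": "string", "text": "string"}],
}

with open("parse-recipe-image.txt") as f:  # the base prompt linked above
    base_prompt = f.read()

prompt = (
    f"{base_prompt}\n\n"
    "Respond with JSON matching this schema:\n"
    f"{json.dumps(recipe_schema, indent=2)}"
)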
