Description
Is there an existing issue for this?
- I have searched the existing issues
OS
Linux
GPU
cuda
VRAM
12GB
What version did you experience this issue on?
What happened?
sdxl/main/SDXL base 1_0:
  path: sdxl/main/SDXL base 1_0
  description: SDXL base v1.0
  vae: sdxl/vae/sdxl-vae-fp16-fix/
  variant: normal
  format: diffusers

Note the presence of a vae override and the use of a relative path there.
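For illustration, here is a minimal sketch of how a relative `vae:` entry like this would presumably need to be resolved against the InvokeAI models directory before any file check can succeed. The `models_root` value and paths are assumptions for the example, not the actual InvokeAI implementation:

```python
from pathlib import Path

# Illustrative only: models_root is an assumed install location, not read from config.
models_root = Path("/home/user/invokeai/models")

vae_entry = "sdxl/vae/sdxl-vae-fp16-fix/"  # relative path from the models.yaml entry above

relative = Path(vae_entry)
resolved = relative if relative.is_absolute() else models_root / relative

# If the relative path is passed along without this resolution step,
# an existence check against the bare string will fail.
print(relative.exists())   # likely False unless cwd happens to be the models root
print(resolved.exists())   # True if the VAE actually lives under models_root
```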
Traceback: ModelNotFoundException
[2023-08-01 16:43:47,222]::[InvokeAI]::ERROR --> Traceback (most recent call last):
File "src/InvokeAI/invokeai/app/services/processor.py", line 86, in __process
outputs = invocation.invoke(
^^^^^^^^^^^^^^^^^^
File "lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "src/InvokeAI/invokeai/app/invocations/latent.py", line 515, in invoke
vae_info = context.services.model_manager.get_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/InvokeAI/invokeai/app/services/model_manager_service.py", line 364, in get_model
model_info = self.mgr.get_model(
^^^^^^^^^^^^^^^^^^^
File "src/InvokeAI/invokeai/backend/model_management/model_manager.py", line 484, in get_model
model_path = model_class.convert_if_required(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/InvokeAI/invokeai/backend/model_management/models/vae.py", line 103, in convert_if_required
if cls.detect_format(model_path) == VaeModelFormat.Checkpoint:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/InvokeAI/invokeai/backend/model_management/models/vae.py", line 83, in detect_format
raise ModelNotFoundException()
invokeai.backend.model_management.models.base.ModelNotFoundException
[2023-08-01 16:43:47,226]::[InvokeAI]::ERROR --> Error while invoking:
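For context, `detect_format` in `vae.py` (line 83 in the traceback) evidently raises `ModelNotFoundException` before any format classification happens. A minimal sketch of the kind of guard that would produce this error, assuming it is a simple existence test on the supplied path; this is not the actual InvokeAI source:

```python
from pathlib import Path

class ModelNotFoundException(Exception):
    """Stand-in for invokeai.backend.model_management.models.base.ModelNotFoundException."""

def detect_format_sketch(model_path: str) -> str:
    # Assumed behaviour: if the path does not exist (e.g. an unresolved
    # relative vae override), fail before classifying the model format.
    p = Path(model_path)
    if not p.exists():
        raise ModelNotFoundException(model_path)
    # A diffusers model is a directory; a checkpoint is a single file.
    return "diffusers" if p.is_dir() else "checkpoint"
```

Under that assumption, the relative `vae:` path would reach `detect_format` unresolved, the existence check would fail, and the exception would propagate exactly as shown in the traceback.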