
Model loading from other sources. #34

Open
hukk06 opened this issue Feb 11, 2023 · 4 comments
Labels
enhancement New feature or request

Comments


hukk06 commented Feb 11, 2023

Super satisfied with the speed, thanks for such an amazing tool.

Regarding TensorRT engine building: would it be possible to add a feature to load the model from a local drive instead of Hugging Face? Or, if that is already possible, how? Also, if this feature is planned, could .safetensors support be added in the future?

Thank you for your consideration.

ddPn08 (Owner) commented Feb 12, 2023

You can load a local model in diffusers format. When running in Docker, the model folder must be mounted into the container in advance.
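For context, a model in diffusers format is a directory containing a `model_index.json` plus component subfolders (`unet`, `vae`, `text_encoder`, and so on). A minimal sketch of checking whether a local folder is already in that layout (the helper name and paths here are hypothetical, not part of this tool):

```python
import json
from pathlib import Path

def is_diffusers_model(path: str) -> bool:
    # A diffusers pipeline folder is identified by its model_index.json file.
    return (Path(path) / "model_index.json").is_file()

# Build a tiny fake model folder just to demonstrate the check.
root = Path("demo_model")
root.mkdir(exist_ok=True)
(root / "model_index.json").write_text(
    json.dumps({"_class_name": "StableDiffusionPipeline"})
)

print(is_diffusers_model("demo_model"))  # True
```

When running in Docker, the host folder additionally has to be visible inside the container, e.g. mounted with `docker run -v /host/models:/models ...` (paths hypothetical).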

knot2006 commented
Any info on how to actually do it?

[image]

Gets me this error:

[image]

I am using the local version (not Docker).

Stax124 (Contributor) commented Feb 21, 2023

@knot2006 The model needs to be converted into the diffusers format first. As far as I understand, that is not possible within this tool yet.
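For anyone hitting this: newer releases of the diffusers library can load an original single-file checkpoint and re-save it in the diffusers folder layout. A sketch under those assumptions (paths are placeholders; this needs `diffusers`, `transformers`, and `safetensors` installed, and is not part of this tool itself):

```python
# Hypothetical conversion sketch, not this project's own code.
from diffusers import StableDiffusionPipeline

# from_single_file loads an original .ckpt/.safetensors checkpoint
# (available in newer diffusers releases).
pipe = StableDiffusionPipeline.from_single_file("/path/to/model.safetensors")

# save_pretrained writes the diffusers folder layout
# (model_index.json plus unet/, vae/, text_encoder/, ...).
pipe.save_pretrained("/path/to/converted-model")
```

The resulting folder could then be pointed at directly instead of a Hugging Face model ID, once local loading is supported.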

fantasyz commented
I really hope this can be done. I want to use my custom-trained models directly from my SSD without uploading them to Hugging Face.

@ddPn08 ddPn08 added the enhancement New feature or request label Apr 30, 2023
5 participants