
How do you run inferences outside of the webui like a script? #65

Open
nub2927 opened this issue Apr 2, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

nub2927 commented Apr 2, 2023

My use case for TensorRT is that I need to generate a lot of regularization images. However, I construct them by drawing from a pool of unique prompts, each of which correlates to a unique image size, so I need a script that draws from this pool and sets the prompt and image size for each inference. In AUTOMATIC1111's webui I do this with the custom scripts feature, but inference was too slow there. Is there a way for me to write a script that calls the TensorRT inference in the same manner?
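This extension does not document a scripting entry point, but the pool-drawing loop described above can be sketched independently of it. In the sketch below, the pool entries and the `infer` callback are hypothetical placeholders: `infer` stands in for whatever TensorRT inference call eventually becomes available, and the loop just handles drawing a prompt with its correlated image size for each run.

```python
import random

# Hypothetical prompt pool: each unique prompt carries its own image size.
PROMPT_POOL = [
    {"prompt": "a photo of a person", "width": 512, "height": 512},
    {"prompt": "a full-body portrait", "width": 512, "height": 768},
    {"prompt": "a wide landscape", "width": 768, "height": 512},
]

def iter_jobs(pool, count, seed=0):
    """Yield `count` (prompt, width, height) tuples drawn from the pool."""
    rng = random.Random(seed)  # fixed seed so a batch is reproducible
    for _ in range(count):
        entry = rng.choice(pool)
        yield entry["prompt"], entry["width"], entry["height"]

def run_batch(pool, count, infer):
    """Call `infer(prompt, width, height)` once per drawn job.

    `infer` is a placeholder for the actual inference function
    (e.g. a TensorRT-backed pipeline call); it is not part of this repo's
    documented API.
    """
    results = []
    for prompt, width, height in iter_jobs(pool, count):
        results.append(infer(prompt, width, height))
    return results
```

Once a scripting API exists, `infer` would be replaced by the real call; the pool logic itself stays the same.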

ddPn08 (Owner) commented Apr 30, 2023

A plugin API will be released soon.

ddPn08 added the enhancement label Apr 30, 2023