I am testing Torch 2.0.1, 2.1, and 1.13.1.
The speed changes, but the results change as well.
For example, 2.0.1 is slower per step, yet it finishes in fewer steps and produces better output.
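If it helps to isolate the version-dependent differences, here is a minimal sketch (my own setup, not taken from this issue) that pins the RNG seeds and requests deterministic kernels before each run, so that any remaining output differences are more likely attributable to the Torch version itself:

```python
# Sketch: pin randomness before comparing the same run under different Torch versions.
import torch

def make_deterministic(seed: int = 0) -> None:
    torch.manual_seed(seed)                               # seeds CPU and CUDA RNGs
    torch.use_deterministic_algorithms(True, warn_only=True)  # prefer deterministic kernels
    torch.backends.cudnn.benchmark = False                # disable autotuned (non-reproducible) kernel selection

make_deterministic()
# ... run the identical training/captioning steps under each Torch version and compare outputs ...
```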
Any news?
I made my own version with batch captioning, and it is working great:
https://www.patreon.com/posts/sota-image-for-2-90744385
Included scripts: captioners_clip_interrogator_v2.zip, LLaVA_auto_install_v3.zip, Qwen-VL_v3.zip, blip2_captioning_v1.zip, CogVLM_v7.zip, Kosmos-2_v5.zip
I'm happy that you got results, but for me it doesn't make sense to pay just to test or use it.