
GPU performance increase with xformers #695

Closed

AestheticMayhem opened this issue Apr 1, 2023 · 9 comments

Labels
enhancement (New feature or request) · stale

Comments

@AestheticMayhem commented Apr 1, 2023


Is it possible to implement xformers for better performance, as in Stable Diffusion? I only have a GTX 1060 with 6 GB of VRAM and 16 GB of RAM. It would be cool if it worked; just a suggestion, as I know nothing about coding.

Thank you for the good work.

AestheticMayhem added the enhancement (New feature or request) label on Apr 1, 2023
@lolxdmainkaisemaanlu

I have the exact same setup as yours. I use xformers and the --medvram flag in Stable Diffusion, and they help me a lot. Would appreciate having it here too.

@AestheticMayhem (Author) commented Apr 2, 2023

> I have the exact same setup as yours. I use xformers and the --medvram flag in Stable Diffusion, and they help me a lot. Would appreciate having it here too.

Here is the link to xformers: https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers

There is an installation guide that involves cloning a repository into the program's main folder. Additionally, a .whl file must be copied into the same main folder and installed. I am currently attempting to follow this guide, and I believe it should work, as xformers was designed primarily for NLP tasks. However, as a novice, I am unsure where to place the files, and I would greatly appreciate any assistance.
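(For anyone following along: a quick way to confirm the wheel actually landed in the Python environment the program runs in is a minimal import check. This is just a sketch and assumes nothing about the project itself.)

```python
# Minimal sanity check that the xformers wheel is importable from the same
# Python environment the program runs in.
import torch

try:
    import xformers
    import xformers.ops  # the memory-efficient attention kernels live here
    print("xformers", xformers.__version__,
          "| CUDA available:", torch.cuda.is_available())
except ImportError:
    print("xformers not found in this environment; install the wheel with pip first")
```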

I have asked GPT multiple questions, and it has consistently confirmed that it should be possible to use xformers if the library is loaded correctly and the main run file contains the appropriate instructions to enable it.

I am attempting to locate the equivalent bridge instruction in the Stable Diffusion code so that I can add it to Oobabooga's .py file. Based on my research, the instruction should resemble the transformer-loading section, where the tokenizer and model are loaded onto CUDA for computation.
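(For reference, on the Stable Diffusion side that bridge is a single call when using the diffusers library; a minimal sketch, assuming diffusers and xformers are installed, with the model ID as an example only:)

```python
# Sketch of the Stable Diffusion side "bridge": one call swaps the pipeline's
# attention implementation for xformers' memory-efficient kernel.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.enable_xformers_memory_efficient_attention()
```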

Establishing an official Oobabooga Discord channel would give the community a beneficial platform to collaborate on and enhance the program's development, as it shows promise.

@QuantumAlignmentLabOnline

> I have the exact same setup as yours. […]

Any updates?

@knoopx commented Apr 4, 2023

related huggingface/transformers#22386

@AestheticMayhem (Author)

> Any updates?

I attempted to clone the xformers repository and copied it to both the main oobabooga folder and the modules folder for extension-related configuration loads. In addition, I installed the necessary dependencies for xformers with PyTorch 2 acceleration and the CUDA toolkit for the GPU. I also added a command-line flag to the webui.bat file to force loading when running the program. While it recognizes xformers, it still generates an error. I believe we need to include instructions in the main server .py file to establish a relationship with, and an understanding of, the xformers functions when running the program.
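(A sketch of the kind of instruction that comment is describing: the run file would route the model's attention through xformers' kernel when available. All names below are illustrative, not the project's actual code.)

```python
# Illustrative sketch only: the shape a "bridge" in the run file could take,
# falling back to plain PyTorch when xformers or CUDA is unavailable.
import torch
import torch.nn.functional as F

try:
    import xformers.ops as xops
    HAS_XFORMERS = True
except ImportError:
    HAS_XFORMERS = False

def attention(q, k, v):
    # q, k, v: [batch, seq_len, n_heads, head_dim] (xformers' expected layout)
    if HAS_XFORMERS and q.is_cuda:
        # Avoids materializing the full seq_len x seq_len attention matrix,
        # which is where the VRAM savings come from.
        return xops.memory_efficient_attention(q, k, v)
    # PyTorch fallback expects [batch, n_heads, seq_len, head_dim].
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    return F.scaled_dot_product_attention(q, k, v).transpose(1, 2)
```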

@AestheticMayhem (Author)

> related huggingface/transformers#22386

Thanks for sharing, I will get through it.

@AestheticMayhem (Author) commented Apr 5, 2023

> related huggingface/transformers#22386
>
> Thanks for sharing, I will get through it.

This is a promising start, as it appears to be functional and applicable. I came across a link in the thread you provided that demonstrates the use of xformers with a different dataset. The links are https://github.com/Bayes-Song/Open-Llama/blob/main/README_en.md | http://home.ustc.edu.cn/~sl9292/ | https://github.com/Bayes-Song/Open-Llama. This repository originally used xformers for model training, and the computation works well for text-to-text interpretation and generation. I will attempt to duplicate the repository on top of oobabooga, using the same setup they used for training, but for generation purposes. In addition to using the --xformer flag when launching the program, it is recommended to use the accelerate command to ensure that the torch/CUDA link is taken into account. I am hoping to complete this task by tonight.
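(For the accelerate part, a minimal sketch of the launch pattern being described; the toy model is a stand-in for the real one, and the script would be started with `accelerate launch run.py`:)

```python
# run.py - minimal accelerate pattern: the Accelerator picks the device
# (CUDA when available) and moves the model onto it.
import torch
from accelerate import Accelerator

accelerator = Accelerator()
model = accelerator.prepare(torch.nn.Linear(16, 16))  # stand-in model
print("model device:", next(model.parameters()).device)
```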

@AestheticMayhem (Author)

To complete this task, you can follow the steps below (a sanity-test sketch for the last step follows the list):

1. Clone the repository: First, clone the repository using the command git clone https://github.com/Bayes-Song/Open-Llama.git. This creates a local copy of the repository on your computer.
2. Set up the environment: Install the necessary dependencies for xformers and set up the environment for training and generation purposes.
3. Copy the dataset: Copy the dataset you want to use for training and generation to the appropriate location in the cloned repository.
4. Launch the program: Launch the program using the --xformer flag and the accelerate command to ensure that the torch/CUDA link is taken into account.
5. Test the program: Test the program to ensure that it is working as expected and generating the desired results.

By following these steps, you should be able to duplicate the repository and use a different dataset for training and generation purposes with xformers.
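(For step 5, a minimal sanity test, assuming xformers is installed and a CUDA GPU is present: it compares xformers' kernel against plain PyTorch attention, so a tiny maximum difference means the swap is numerically sound.)

```python
# Compare xformers' memory-efficient attention against the PyTorch baseline.
import torch
import torch.nn.functional as F
import xformers.ops as xops

B, M, H, K = 1, 512, 8, 64  # batch, seq_len, n_heads, head_dim
q, k, v = (torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
           for _ in range(3))

out_x = xops.memory_efficient_attention(q, k, v)
out_ref = F.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
).transpose(1, 2)

print("max abs diff:", (out_x - out_ref).abs().max().item())
```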

@MarkovInequality (Contributor)

#950

github-actions bot added the stale label on Nov 23, 2023

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
