I have tested the official xFormers build for torch 2.5.1 against plain PyTorch, and the speed difference was actually negative, i.e. xFormers was slower than PyTorch. That said, I do have an old GPU; it could be different on something newer, since I can only use torch attention v1.
I have built xFormers 0.29 myself for nightly torch 2.6 (which, by the way, seems to work with any nightly version; currently I have 2.6.0.dev20241212). And the speed, well, it's about the same? Although I suspect xFormers somehow has an edge when it comes to image quality.
That said, I wonder why the 0.29 wheel I built is so big. The resulting file is 446 MB, which is a lot bigger than the last 0.28. Did I build it wrong? I mean, it works fine.
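As a side note on "did I build it wrong": `python -m xformers.info` prints which ops and CUDA architectures a wheel was compiled with (binary size grows roughly with the number of architectures targeted at build time), and a tiny smoke test confirms the kernels actually load. A minimal sketch, assuming a CUDA device and fp16 inputs:

```python
# Smoke test for a self-built xFormers wheel (a sketch; shapes are arbitrary).
import torch
import xformers
import xformers.ops as xops

print("xformers", xformers.__version__, "| torch", torch.__version__)

# (B, M, H, K) = (batch, sequence, heads, head_dim) is the layout xFormers expects.
q = torch.randn(1, 256, 8, 64, device="cuda", dtype=torch.float16)
out = xops.memory_efficient_attention(q, q, q)  # raises if no usable kernel was built
print("memory_efficient_attention OK:", tuple(out.shape))
```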
The benchmark is simply running the same ComfyUI workflow with everything locked; that way one can gauge the difference between samplers, or in this case between xFormers and PyTorch.
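An end-to-end workflow measures much more than the attention op, so a kernel-level difference can disappear into the noise. A rough way to isolate just the attention call is a micro-benchmark along these lines (a sketch; the shapes, dtype, and iteration counts are arbitrary assumptions):

```python
# Micro-benchmark: xFormers memory_efficient_attention vs. torch SDPA on one shape.
import torch
import torch.nn.functional as F
import xformers.ops as xops

B, M, H, K = 2, 4096, 8, 64  # batch, sequence length, heads, head dim (assumed)
q = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

def bench(fn, iters=50):
    for _ in range(5):          # warm-up
        fn()
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(iters):
        fn()
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters  # ms per call

# xFormers takes (B, M, H, K); SDPA takes (B, H, M, K), hence the transposes.
xf_ms = bench(lambda: xops.memory_efficient_attention(q, k, v))
pt_ms = bench(lambda: F.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)))
print(f"xformers: {xf_ms:.3f} ms | sdpa: {pt_ms:.3f} ms")
```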
As for the Titan kernels: we haven't changed them in a while, and they are now available as part of PyTorch. You will get exactly the same speed/result with PyTorch's scaled_dot_product_attention :)
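To confirm that on a given GPU, here is a minimal sketch that forces SDPA onto its memory-efficient backend and compares outputs against xFormers (shapes and tolerances are arbitrary assumptions; `sdpa_kernel` is the torch ≥ 2.3 API, older versions use `torch.backends.cuda.sdp_kernel` instead):

```python
# Check that SDPA's memory-efficient backend matches xFormers' kernel numerically.
import torch
import torch.nn.functional as F
import xformers.ops as xops
from torch.nn.attention import sdpa_kernel, SDPBackend

q = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.float16)  # (B, M, H, K)
k, v = torch.randn_like(q), torch.randn_like(q)

out_xf = xops.memory_efficient_attention(q, k, v)

with sdpa_kernel(SDPBackend.EFFICIENT_ATTENTION):
    out_pt = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
    ).transpose(1, 2)

print(torch.allclose(out_xf, out_pt, atol=1e-3, rtol=1e-3))  # expect True within fp16 tolerance
```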
bertmaher pushed a commit to bertmaher/xformers that referenced this issue on Dec 20, 2024.