
chore: backport TypeAlias for py3.9 compat #1

Closed · wants to merge 15 commits

Conversation

stillmatic

No description provided.
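The PR carries no description, but the title points at the standard PEP 613 backport pattern. A minimal sketch of what such a compat shim typically looks like (an assumption, since the diff isn't shown here; the alias `IntPair` is hypothetical — the repo's real alias is `TensorOperator`, whose definition doesn't appear in this thread):

```python
import sys

# TypeAlias was added to the typing module in Python 3.10; on 3.9 it has to
# come from the third-party typing-extensions backport instead.
if sys.version_info >= (3, 10):
    from typing import TypeAlias
else:
    from typing_extensions import TypeAlias  # extra dependency on py3.9

# Hypothetical alias for illustration:
IntPair: TypeAlias = tuple[int, int]
```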

@stillmatic (Author) left a comment:

see commentary here: invoke-ai/InvokeAI#302 (comment)

@magnusviri

@Birch-san the lstein repo isn't working because of this. invoke-ai/InvokeAI#335

@magnusviri

@Birch-san yesterday somebody downgraded the lstein Mac environment to Python 3.9, which is why it crashes now. This pull request makes this repo work with 3.9.

I'm not sure if we will leave it at 3.9. They are discussing it here.

@stillmatic
Author

I would note that this change is compatible with both 3.9 and 3.10, so if this change is merged, you can use whichever version you please.

@Birch-san
Owner

Birch-san commented Sep 3, 2022

thanks! I'll integrate this. I'll also rebase onto latest (Katherine added a trivial optimization crowsonkb@4567328).

I think I can simplify clone_please into a more sensible patch, now that I've found the source of the ±inf problem.
pytorch/pytorch#84364
just need to .detach().clone() the result returned from utils.py::append_dims.
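For context, a sketch of what `append_dims` does and where the workaround would go (paraphrased from k-diffusion's `utils.py`; shown with NumPy so it's self-contained, though the real code operates on PyTorch tensors):

```python
import numpy as np

def append_dims(x, target_dims):
    """Append trailing singleton dims until x has target_dims dimensions,
    so e.g. a per-sample sigma of shape (N,) can broadcast against an
    image batch of shape (N, C, H, W)."""
    dims_to_append = target_dims - x.ndim
    if dims_to_append < 0:
        raise ValueError(f"input has {x.ndim} dims but target_dims is {target_dims}")
    # In PyTorch this indexing returns a view; the workaround described above
    # is to materialize a copy of the result: append_dims(x, n).detach().clone()
    return x[(...,) + (None,) * dims_to_append]

sigma = np.array([1.0, 2.0])        # shape (2,)
expanded = append_dims(sigma, 4)    # shape (2, 1, 1, 1)
```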

Unrelated to MPS, but the following two points pertain to the patches I've made for getting good results at low sample steps (e.g. 7 or 8)…
https://twitter.com/Birchlabs/status/1565029734865584143?s=20&t=lU-1bZCq0IejmzGM3YzVrg

I think I'll delete the decorate_sigma_hat option I added to the samplers (there's already an idiomatic way to do quantization in k-diffusion: CompVisDenoiser(model, quantize=True) — admittedly it doesn't work when churn is enabled, but nobody's using churn).
crowsonkb#23 (comment)
I anticipate that k-diffusion will update soon with an official way to quantize sigma_hat, which will mean we can get time-step discretization even when churn>0:
crowsonkb#23 (comment)
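"Quantization" here means snapping a continuous sigma to the nearest sigma the model was actually trained on. A minimal sketch of the idea (a simplification for illustration, not k-diffusion's actual implementation, which does the equivalent conversion to a discrete timestep inside `CompVisDenoiser`):

```python
import numpy as np

def quantize_sigma(sigma, trained_sigmas):
    """Snap a continuous sigma to the nearest value in the model's
    discrete (trained) noise schedule."""
    trained = np.asarray(trained_sigmas)
    return float(trained[np.abs(trained - sigma).argmin()])

# e.g. with a trained schedule of [1.0, 2.0, 3.0], a sigma of 2.3 snaps to 2.0
```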

likewise I'll delete the concat_zero option I added to the Karras noise schedule (it's better to increase sigma_min than to remove the zero). the same custom noise schedule of mine can be achieved without any changes to k-diffusion.
crowsonkb#23 (comment)
Birch-san/stable-diffusion@50476fa
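For reference, the zero in question is the one the Karras schedule appends at the end. A sketch paraphrasing k-diffusion's `get_sigmas_karras` (NumPy here for self-containment; the original uses PyTorch):

```python
import numpy as np

def get_sigmas_karras(n, sigma_min, sigma_max, rho=7.0):
    """Karras et al. (2022) noise schedule: a ramp in sigma^(1/rho) space
    from sigma_max down to sigma_min, with a zero appended at the end
    (the zero that the concat_zero option toggled)."""
    ramp = np.linspace(0, 1, n)
    min_inv_rho = sigma_min ** (1 / rho)
    max_inv_rho = sigma_max ** (1 / rho)
    sigmas = (max_inv_rho + ramp * (min_inv_rho - max_inv_rho)) ** rho
    return np.append(sigmas, 0.0)

sigmas = get_sigmas_karras(7, sigma_min=0.1, sigma_max=10.0)
```

Raising sigma_min simply lifts the ramp's low endpoint while keeping the trailing zero, which is the alternative preferred above.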

@Birch-san
Owner

oh, TypeAlias was only being used because of my decorate_sigma_hat patch, which is unrelated to MPS and which I'll be removing (as explained above).

so, we don't need to force python 3.9 users to install typing-extensions. I can just delete TypeAlias.

TypeAlias wasn't that important anyway (more of a nice-to-have best-practice); the type inference would do the right thing even if I hadn't explicitly declared TensorOperator as a TypeAlias.

so I think what I'll do is delete the offending code instead of integrating this PR, if that's okay with you?

@stillmatic
Author

sgtm, the easiest way to fix code is to delete code!

@Birch-san
Owner

thanks 🙂 I've now force-pushed a simpler branch to mps, which has no changes to sampling.py:
crowsonkb/k-diffusion@master...Birch-san:k-diffusion:mps

this removes the TypeAlias and therefore the need for this py3.9 mitigation.

anybody who currently has my branch checked out won't be able to pull, because I force-pushed a new history. to get yourself up to date, cd into the k-diffusion repository (e.g. stable-diffusion/src/k-diffusion), fetch the latest mps branch, and hard-reset your local mps to match the remote one:

git fetch origin mps
git reset --hard origin/mps
