
v0.1.2 #135

Merged: 18 commits, Jan 29, 2023

Conversation

cloneofsimo
Owner

No description provided.

@Dango233

I was just about to ask whether there was any plan for adding resnet ... and you did it!

Meng Zhang and others added 4 commits January 15, 2023 23:13
Remove useless model_id declaration in svd_distill
Avoid using python 3.9 feature ( |= for dict ) so cli_pt_to_safetenso…
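The commit above drops the `|=` dict-union operator because it only exists on Python 3.9+; on older interpreters it raises a `TypeError` for dicts. A minimal sketch of the backward-compatible alternatives (the variable names here are illustrative, not taken from the repository):

```python
# Python 3.9 added the in-place dict union operator:
#   defaults |= overrides
# On Python < 3.9 that line fails with TypeError for dicts, so a
# backward-compatible script should merge dicts another way.

defaults = {"rank": 4, "alpha": 1.0}
overrides = {"alpha": 0.5, "device": "cuda"}

# Works on Python 3.5+: build a new merged dict via unpacking.
merged = {**defaults, **overrides}

# Or mutate in place; equivalent to `defaults |= overrides` on 3.9+.
defaults.update(overrides)
```

Both forms give `overrides` precedence on duplicate keys, matching the semantics of `|=`.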
@cloneofsimo
Owner Author

Ah, I should make SVD distillation compatible with Conv layers as well. That should definitely boost performance.
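The idea of extending SVD distillation to Conv layers can be sketched as: flatten the 4-D conv kernel into a 2-D matrix, take a truncated SVD, and keep the top-rank factors as the low-rank pair. This is an illustrative NumPy sketch only; the function name, shapes, and rank are made up here and the actual lora implementation may differ:

```python
import numpy as np

def svd_distill_conv(weight, rank):
    """Split a conv kernel of shape (out_c, in_c, kh, kw) into two
    low-rank factors via truncated SVD (hypothetical helper)."""
    out_c, in_c, kh, kw = weight.shape
    # Flatten the input-channel and spatial dims into one axis so the
    # kernel becomes an (out_c, in_c*kh*kw) matrix.
    w2d = weight.reshape(out_c, in_c * kh * kw)
    u, s, vt = np.linalg.svd(w2d, full_matrices=False)
    # Keep only the top-`rank` singular directions; fold the singular
    # values into the "up" factor.
    up = u[:, :rank] * s[:rank]    # (out_c, rank)
    down = vt[:rank, :]            # (rank, in_c*kh*kw)
    return up, down

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 4, 3, 3))
up, down = svd_distill_conv(w, rank=2)
# Reconstruct the rank-2 approximation of the original kernel.
approx = (up @ down).reshape(w.shape)
```

By Eckart-Young, this truncation is the best rank-`r` approximation of the flattened kernel in the Frobenius norm, which is why the same SVD recipe used for Linear layers carries over once the conv weight is reshaped.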

cloneofsimo and others added 6 commits January 17, 2023 04:03
* feat basic preprocessing pipelines

* comments + mediapipe face mask

* simple CLI for easily pre-processing dataset

* CLI for masking + captioning outputs

* dataset (backward compatible with face seg)

* just better stuff in general :)

* temp, moving blur to dataset

* captioning + PTI training working

* duplicate mask behavior

* mask working as intended as well

grammar / nitpick: twice as faster => twice as fast
…nt updates (#140)

* feat : much more canonical structured pruning

* update contents + notebooks

* wavy's analog diffusion

* futureproof lora_add?

* fix : max caption length

* analog ranks

* mask temperature

* preprocess with length

* default non token for placeholder

* req

* addition finally working, this took so long...

* null as option

* bit more bug fix, remove verbose

* weight apply deprecated, ckpt conversion for A1111