Dev quant tools and fix graph file management #495
Conversation
ccssu commented on Jan 5, 2024 (edited)
- Usage documentation: https://github.com/siliconflow/sd-team/issues/193#issuecomment-1893290387
- https://huggingface.co/siliconflow/stable-diffusion-v1-5-onediff-enterprise-v1
mse = torch.mean((org_latent_sample - cur_latent_sample) ** 2)
ssim = 1 - mse
MSE: (latent samples generated by the original model vs. latent samples generated by the quantized model with one layer quantized)
In the SDXL quantization experiments, 1 - mse can stand in for the SSIM value when comparing reconstructed images.
sdxl_base_1_0_key_ssim_mse.txt

Model parameter name      ssim value            1 - mse value
time_embed.0              0.9756851169369525    0.9671265296638012
time_embed.2              0.9616645683538984    0.8562246561050415
label_emb.0.0             0.97451926291366      0.9558620862662792

The figure below shows all entries with ssim < 0.97 (ssim = image generated by the original model vs. image generated by the quantized model with one layer quantized).
The ssim values and the 1 - mse values follow the same trend; the two are correlated.
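The 1 - mse proxy above can be sketched without torch. This is a dependency-free illustration on flat latent vectors; the sample values and helper names are made up for the example:

```python
def mse(a, b):
    """Mean squared error between two equal-length latent vectors."""
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def similarity_mse(a, b):
    """The 1 - mse proxy used in this PR: closer to 1.0 means more similar."""
    return 1.0 - mse(a, b)

# Toy latents standing in for the original vs. quantized model outputs.
org = [0.10, -0.25, 0.40, 0.05]
quant = [0.12, -0.20, 0.38, 0.07]
print(round(similarity_mse(org, quant), 6))  # 0.999075
```

A value near 1.0, like a high SSIM, indicates the quantized layer barely perturbed the latent.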
It looks like the mse metric has a wider distribution and better discrimination. We could try replacing ssim with mse when quantizing sdxl + deepcache and see how it performs.
count = len(
    [v for v in args_tree.iter_nodes() if isinstance(v, flow.Tensor)]
)
return f"{graph_file}_{count}.graph"
Chose to append the number of input tensor arguments after f"{graph_file}". @strint
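A minimal sketch of this naming scheme, assuming a simple nested-list "args tree" and a placeholder Tensor class standing in for flow.Tensor (oneflow is not imported here):

```python
class Tensor:
    """Placeholder for flow.Tensor in this sketch."""
    pass

def iter_nodes(tree):
    """Yield every leaf node of a nested list/tuple structure."""
    for item in tree:
        if isinstance(item, (list, tuple)):
            yield from iter_nodes(item)
        else:
            yield item

def graph_file_name(graph_file, args_tree):
    # Append the number of tensor inputs so graphs traced with a
    # different input count get distinct cache files.
    count = len([v for v in iter_nodes(args_tree) if isinstance(v, Tensor)])
    return f"{graph_file}_{count}.graph"

args = [Tensor(), [Tensor(), 3], "cond", (Tensor(),)]
print(graph_file_name("unet", args))  # unet_3.graph
```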
Note 2: in the ControlNet node, the strength parameter does not support switching between 0 and values greater than 0. A value of 0 effectively disables the ControlNet; a value greater than 0 means the ControlNet is applied. Please do not switch between 0 and values greater than 0, or errors will occur.
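The constraint above could be enforced with a small guard. This is a hypothetical sketch, not part of the PR: the guard class and its API are invented for illustration, rejecting a zero/nonzero switch explicitly instead of letting the compiled graph fail later:

```python
class ControlNetStrengthGuard:
    """Rejects switching strength across the 0 / >0 boundary (illustrative)."""

    def __init__(self):
        self._was_enabled = None  # unknown until the first call

    def check(self, strength):
        enabled = strength > 0
        if self._was_enabled is not None and enabled != self._was_enabled:
            raise ValueError(
                "strength cannot switch between 0 and >0 after compilation"
            )
        self._was_enabled = enabled
        return strength

guard = ControlNetStrengthGuard()
guard.check(0.8)   # ok: ControlNet enabled
guard.check(0.5)   # ok: still > 0
try:
    guard.check(0.0)  # crosses the boundary
except ValueError as e:
    print("rejected:", e)
```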
if index % 10 == 0 or index == length - 1:
    torch.save(calibrate_info, cached_process_output_path)
Shouldn't this cache behavior be explicitly enabled by the user? Otherwise it may write too many files and directories.
> Shouldn't this cache behavior be explicitly enabled by the user? Otherwise it may write too many files and directories.

Here the cache corresponds to the quantize_info_and_relevance_metrics.pt file in the onediff-quant quantization script.
Should we add a cached_process_output: ["enable", "disable"] option?
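A sketch of what such an opt-in flag could look like, gating the periodic save shown earlier. The maybe_save helper and its signature are assumptions; only the cached_process_output values and the file name come from this thread, and pickle stands in for torch.save:

```python
import os
import pickle
import tempfile

def maybe_save(calibrate_info, index, length, out_dir,
               cached_process_output="disable"):
    """Periodically save calibration info, but only if the user opted in."""
    if cached_process_output != "enable":
        return None  # user did not opt in: write nothing
    if index % 10 == 0 or index == length - 1:
        path = os.path.join(out_dir, "quantize_info_and_relevance_metrics.pt")
        with open(path, "wb") as f:
            pickle.dump(calibrate_info, f)  # stand-in for torch.save
        return path
    return None

with tempfile.TemporaryDirectory() as d:
    # Default is "disable": nothing is ever written.
    assert maybe_save({"a": 1}, 10, 20, d) is None
    # Explicit opt-in: saved every 10th step and on the last step.
    p = maybe_save({"a": 1}, 10, 20, d, cached_process_output="enable")
    print(os.path.basename(p))  # quantize_info_and_relevance_metrics.pt
```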
I don't think the Comfy tool necessarily needs to support caching. If the goal is to let users tune those few quantization parameters, we can simply copy calibrate_info in memory: each time the parameters change, generate a new calibrate_info and write it to a user-specified directory, while the original calibrate_info is never modified.
If we really need a cache, I have a few questions:
- Could switching models cause the cache to be misused? How do we decide whether a cache entry is a hit?
- When is the cache directory cleared? Does that require manual user action, and how does the user know which directory to clear?
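The no-cache alternative suggested above (keep the original calibrate_info immutable, emit a fresh copy per parameter change) might look like the following sketch; the retune helper, the "params" slot, and JSON output are all illustrative stand-ins:

```python
import copy
import json
import os
import tempfile

def retune(original_info, new_params, out_dir, tag):
    """Produce a new calibrate_info file; never mutate the original dict."""
    info = copy.deepcopy(original_info)   # original is left untouched
    info["params"] = new_params           # hypothetical parameter slot
    path = os.path.join(out_dir, f"calibrate_info_{tag}.json")
    with open(path, "w") as f:
        json.dump(info, f)
    return path

original = {"time_embed.0": {"ssim": 0.9756}}
with tempfile.TemporaryDirectory() as d:
    retune(original, {"conv_ssim_threshold": 0.9}, d, "run1")
    assert "params" not in original  # the source dict was not modified
    print(sorted(os.listdir(d)))  # ['calibrate_info_run1.json']
```

Since every tuning run writes to a fresh, user-chosen path, the cache-hit and cleanup questions above simply never arise.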
)
calibrate_info[sub_name] = {
    "ssim": similarity_mse,
Naming the ssim field mse would be more accurate.
One issue: a larger mse means the two images are less similar, while similarity_mse (1 - mse), like the real ssim, gets larger as the two images become more similar.
Then how about calling it max_mse_loss? "loss" makes it easier to see that smaller values are better.
I think that works. With mse, smaller means more similar; change the default value to 0.1.
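The renamed loss-style metric could then be consumed like this. A hedged sketch: the mse_loss field name and the select_quantizable helper are assumptions from this thread, not the final API; only the 0.1 default and the table values are from the discussion above:

```python
def select_quantizable(metrics, max_mse_loss=0.1):
    """Keep layers whose quantization MSE stays below the threshold.

    Smaller loss = more similar, so layers above the threshold are
    considered too damaged by quantization and are skipped.
    """
    return {name: m for name, m in metrics.items()
            if m["mse_loss"] <= max_mse_loss}

# Losses derived from the 1 - mse table earlier: mse_loss = 1 - (1 - mse).
metrics = {
    "time_embed.0": {"mse_loss": 0.0329},
    "time_embed.2": {"mse_loss": 0.1438},   # above the 0.1 default
    "label_emb.0.0": {"mse_loss": 0.0441},
}
print(sorted(select_quantizable(metrics)))  # ['label_emb.0.0', 'time_embed.0']
```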