@jinmin527 Hi, could you share how you quantized and saved the model? In my fine-tuning script I added `BMQuant.quantize(model, quantization_config)` right after `get_model` to apply quantization. The model saves without errors, but loading the saved model for inference raises `ValueError: Missing some parameters in layer 1`.
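For context, a minimal sketch of the setup described above (not runnable as-is; the import path, `get_model` helper, and config object are assumptions from the fine-tuning script, not confirmed API):

```python
# Sketch of the reported setup, assuming BMCook's BMQuant and a
# get_model helper from the fine-tuning code.
from bmcook import BMQuant  # assumed import path

model = get_model(args)                       # fine-tuning model setup
BMQuant.quantize(model, quantization_config)  # wrap layers for quantization
# ... fine-tuning loop ...
bmtrain.save(model, "checkpoint.pt")          # saving succeeds
# Reloading checkpoint.pt for inference then fails with:
#   ValueError: Missing some parameters in layer 1
```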
I compressed a model with BMCook, configuring both quantization and distillation, and the training loss converged well. However, when saving the model I noticed that the checkpoint files were no smaller than before; on inspection, the linear-layer parameters are still stored as fp32.
Also, bmtrain's optimizers `AdamOffloadOptimizer` and `AdamOptimizer` only support saving parameters as fp32 or fp16; saving parameters as int8 is not implemented.
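As a quick sanity check on the expected savings: if weights were truly exported as int8, the checkpoint should be roughly 4x smaller than fp32 storage, so an unchanged file size is a strong sign the master weights are still fp32. Illustrative arithmetic only, using the Python stdlib:

```python
import array

# Bytes needed to store one million weights at each precision.
n = 1_000_000
fp32_bytes = array.array('f', [0.0]).itemsize * n  # 4 bytes per weight
int8_bytes = array.array('b', [0]).itemsize * n    # 1 byte per weight

print(fp32_bytes // int8_bytes)  # fp32 storage is 4x larger than int8
```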