Hello, and thank you first of all for your work. As the title says, I printed the model parameters after pruning and after finetuning, and found that after finetuning the channel counts go back to the original model's channel counts in every conv layer, so the model saved after finetuning is larger. Is this a mistake on my end, or is this how the code is supposed to behave?
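For reference, a minimal way to compare the two checkpoints is to count parameters directly from the saved state dicts. This is only a sketch: the file names are placeholders, and it assumes each checkpoint was saved with `torch.save(...)` of either a bare `state_dict` or a wrapper dict containing one.

```python
# Sketch: compare parameter counts of a pruned vs. finetuned checkpoint.
# File names are placeholders; adjust the key lookup to your save format.
import torch

def count_params(path):
    state = torch.load(path, map_location="cpu")
    # unwrap if the checkpoint is a dict like {"state_dict": ...}
    if isinstance(state, dict) and "state_dict" in state:
        state = state["state_dict"]
    return sum(t.numel() for t in state.values() if torch.is_tensor(t))

print("pruned:   ", count_params("pruned.pth.tar"))      # hypothetical paths
print("finetuned:", count_params("finetuned.pth.tar"))
```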
If you are using the mask implementation, the model can indeed get larger, because we add new channel selection layers.
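To make that concrete, here is a rough sketch (not the repo's actual code) of what a mask-style channel selection layer looks like: pruning is expressed by zeroing an extra per-channel parameter, so the convolution weights keep their original shapes and the saved checkpoint can grow slightly instead of shrinking.

```python
# Illustration only: a mask-style channel selection layer.
# "Pruned" channels are zeroed in `indexes`, but nothing is physically removed,
# so every layer keeps its original shape plus these extra per-channel parameters.
import torch
import torch.nn as nn

class ChannelSelection(nn.Module):
    def __init__(self, num_channels):
        super().__init__()
        self.indexes = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):
        # x: (N, C, H, W); multiplying by the mask suppresses pruned channels
        # without changing the tensor's channel dimension.
        return x * self.indexes.view(1, -1, 1, 1)
```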
Thanks for the explanation; I roughly understand your approach now. One more question: my parameter count after finetuning is larger than after pruning (ResNet pruning, roughly 400k -> 500k+), but in your reported results the parameter counts after pruning and after finetuning are the same. Why is that?
Finetuning does not change the parameter count; it uses the same structure.
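In other words, finetuning is meant to run on the already-pruned architecture, so the saved checkpoint keeps the pruned channel counts. A sketch of that flow, using a toy stand-in model rather than the repo's ResNet; the checkpoint keys ("cfg", "state_dict"), file name, and `build_from_cfg` helper are assumptions for illustration, not the repo's exact API:

```python
# Sketch: finetune the pruned network, so its state_dict keeps the pruned shapes.
import torch
import torch.nn as nn

def build_from_cfg(cfg):
    # Hypothetical helper: build a small conv stack from pruned channel counts.
    layers, in_c = [], 3
    for out_c in cfg:
        layers += [nn.Conv2d(in_c, out_c, 3, padding=1), nn.BatchNorm2d(out_c), nn.ReLU()]
        in_c = out_c
    return nn.Sequential(*layers)

ckpt = torch.load("pruned.pth.tar", map_location="cpu")  # hypothetical checkpoint
model = build_from_cfg(ckpt["cfg"])                       # pruned channel counts
model.load_state_dict(ckpt["state_dict"])
# ... finetune `model` as usual; saving its state_dict keeps the pruned shapes,
# so the checkpoint does not grow back to the original size.
```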
Hello! May I ask how you ended up resolving the issue of the model getting larger after finetuning? Looking forward to your reply.