Issue Encountered When Loading the Model: "pretrain_videomae_base_patch16_224" #105
Comments
Hello, I have encountered the same issue as you. Indeed, the model …
Hello, did you solve the problem?
It has been quite some time, and I apologize for the delay in my response. I recently came across your question on GitHub, and after reviewing my previous work, I have found some information that might be helpful to you. Based on my recollection, it seems that I made modifications to the code responsible for loading the model in the "run_mae_pretraining.py" file. Specifically, I commented out the line model = get_model(args) and instead used the following code:
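A minimal sketch of a direct instantiation along those lines, assuming the factory function defined in the repository's modeling_pretrain.py and the keyword arguments that get_model(args) forwards in run_mae_pretraining.py (both assumptions worth checking against your own copy):

    # Sketch only: construct the model directly instead of going through
    # timm.create_model, which can only find models already registered with timm.
    # Assumes modeling_pretrain.py from the VideoMAE repository is importable.
    import modeling_pretrain

    # These keyword arguments mirror what get_model(args) is assumed to forward;
    # adjust them to match your own args.
    model = modeling_pretrain.pretrain_videomae_base_patch16_224(
        pretrained=False,
        drop_path_rate=args.drop_path,
        decoder_depth=args.decoder_depth,
    )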
These adjustments allowed me to successfully run the VideoMAE code, and I achieved promising experimental results. I hope this information proves useful to you in your own work.
This doesn't seem to solve it. Is there any other way?
Dear [Support Team],

I hope this message finds you well. First of all, I would like to express my appreciation for your work. I have encountered a problem when trying to reproduce your code and load a model using the following line:

model = get_model(args)

The default model specified in args is pretrain_videomae_base_patch16_224. However, when running the code, I received an error stating that the model could not be found:

RuntimeError: Unknown model (pretrain_videomae_base_patch16_224)

I am currently using timm version 0.4.12. When I attempt to search for the model using model_vit = timm.list_models('*videomae*'), it does not appear in the results. I kindly request your assistance in resolving this issue; any guidance or suggestions would be greatly appreciated.
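For reference, timm.list_models only reports models whose @register_model decorators have actually run, so a minimal check, assuming the repository's modeling_pretrain.py is importable, is to import that module first and list again:

    import timm
    # Importing the module executes its @register_model decorators, which is
    # what makes the VideoMAE names visible to timm.list_models/create_model.
    import modeling_pretrain  # noqa: F401

    print(timm.list_models('*videomae*'))
    # If 'pretrain_videomae_base_patch16_224' now appears in the output, the
    # original model = get_model(args) call should succeed as well.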
Thank you very much for your attention and support.