Hello,
I have a question regarding the autoformer-tiny model mentioned in the README.md of Autoformer.
When I downloaded the file, it was named `supernet-tiny.pth`, which led me to believe that it is a supernet trained with the following configuration: head_num: 4, layer_num: 14, and embed_dim: 240 (256). However, after examining the weight matrices in the file, their shapes do not seem to match these specifications.

Could you please clarify whether autoformer-tiny is indeed a supernet? If not, could you provide more details about the specific structure options used to train this model?
https://github.com/microsoft/Cream/blob/main/AutoFormer/experiments/subnet/AutoFormer-T.yaml
Or is it a subnet sampled from the supernet using the configuration linked above?
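For reference, this is roughly how I inspected the checkpoint. It is only a minimal sketch: the `"model"` wrapper key and the `blocks.` prefix are assumptions based on typical ViT-style checkpoints, not something confirmed for this file.

```python
# Minimal sketch of how I inspected the checkpoint's weight shapes.
# Assumptions: the weights may be wrapped under a "model" key, and
# transformer layers are named "blocks.<index>...." as in ViT/DeiT-style code.
import torch

ckpt = torch.load("supernet-tiny.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)  # unwrap if weights sit under "model"

block_indices = set()
for name, tensor in state_dict.items():
    if name.startswith("blocks."):
        block_indices.add(int(name.split(".")[1]))
    print(f"{name:60s} {tuple(tensor.shape)}")

print("number of blocks found:", len(block_indices))
```

The printed shapes (e.g. the width of the patch-embedding and qkv weights, and the number of `blocks.*` entries) are what I compared against head_num: 4, layer_num: 14, and embed_dim: 240 (256).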
Thank you for your assistance.