Hello, how much time and computing resources are needed for UniST Pretrain and Prompt-tuning?

The time and computing resources required for UniST pretraining and prompt-tuning vary with the number and size of the datasets involved. In our experience, using more than 20 datasets typically requires approximately 8 hours for pretraining and an additional 12 hours for prompt-tuning.