Hi, I'm currently trying to fine-tune the pretrained model on my own ATAC-seq data using HeadAdapterWrapper. Due to device limitations and long training times, I set finetune_last_n_layers_only to 2 and use Adam as the optimizer, with CosineAnnealingLR as the learning rate scheduler annealing from 1e-3 down to 1e-5. However, the results don't look good, so could you share your learning rate, optimizer, and other hyper-parameters? Thanks very much.
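For reference, here is a minimal sketch of the setup described above, assuming HeadAdapterWrapper from enformer-pytorch (where finetune_last_n_layers_only is passed at the forward call, per that library's README); num_tracks, the batch shapes, and the epoch count are placeholders, not values from my actual run:

```python
import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import CosineAnnealingLR
from enformer_pytorch import Enformer
from enformer_pytorch.finetune import HeadAdapterWrapper

# Load the pretrained trunk and wrap it with a new prediction head.
enformer = Enformer.from_pretrained('EleutherAI/enformer-official-rough')
model = HeadAdapterWrapper(enformer = enformer, num_tracks = 1).cuda()  # 1 ATAC track (placeholder)

optimizer = Adam(model.parameters(), lr = 1e-3)
num_epochs = 50  # placeholder; T_max should match the planned number of scheduler steps
scheduler = CosineAnnealingLR(optimizer, T_max = num_epochs, eta_min = 1e-5)

for epoch in range(num_epochs):
    # Synthetic stand-ins for one ATAC-seq batch (replace with a real DataLoader).
    seq = torch.randint(0, 5, (1, 196_608)).cuda()  # Enformer's full input length in bp
    target = torch.randn(1, 896, 1).cuda()          # 896 output bins, 1 track

    loss = model(
        seq,
        target = target,
        finetune_last_n_layers_only = 2  # train only the last 2 transformer layers
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # anneals the LR from 1e-3 toward 1e-5 over num_epochs steps
```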