How to modify hyperparameters for achieving the highest accuracy with Citrinet? #7723
Varuzhan97 started this conversation in General
Replies: 1 comment
Hi. I am trying to train an Armenian ASR model using transfer learning from the English Citrinet (stt_en_citrinet_256) model. However, training is not going well and the model is not achieving high accuracy. Below are the hyperparameters of my training. Can you please help me adjust them to reach my target accuracy?
```
model: stt_en_citrinet_256
tokenizer_type: unigram
vocab_size: 1024
freeze_encoder: False
batch_size: 4
use_start_end_token: True
trim_silence: False
learning_rate: 0.001
betas: [0.95, 0.5]
weight_decay: 0.001
warmup_steps: None
warmup_ratio: 0.1
min_learning_rate: 1e-5
spec_augment.freq_masks: 2
spec_augment.freq_width: 27
spec_augment.time_masks: 10
spec_augment.time_width: 0.05
epochs: 150
dataset_length: 34.73 hours
```
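Since `warmup_steps` is `None`, the scheduler's warmup length is derived from `warmup_ratio` as a fraction of the total training steps. A minimal sketch of that arithmetic for the numbers above (the ~5 s average utterance length is an assumption, not something stated in this post):

```python
# Estimate the warmup steps implied by warmup_ratio when warmup_steps is None.
# Assumption (hypothetical): average utterance length of ~5 seconds.
AVG_UTT_SEC = 5.0       # assumed average clip length, not from the post
DATASET_HOURS = 34.73   # from the config above
BATCH_SIZE = 4          # from the config above
EPOCHS = 150            # from the config above
WARMUP_RATIO = 0.1      # from the config above

num_utterances = int(DATASET_HOURS * 3600 / AVG_UTT_SEC)
steps_per_epoch = num_utterances // BATCH_SIZE
total_steps = steps_per_epoch * EPOCHS
warmup_steps = int(WARMUP_RATIO * total_steps)

print(f"~{num_utterances} utterances, {steps_per_epoch} steps/epoch, "
      f"{total_steps} total steps, {warmup_steps} warmup steps")
```

Under that assumption the run spends roughly the first 10% of its ~940k optimizer steps warming up before the learning rate starts decaying toward `min_learning_rate`.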