Commit cc7b56d
Support custom lr gamma
yqzhishen committed Feb 27, 2023
1 parent defc997 commit cc7b56d
Showing 3 changed files with 8 additions and 3 deletions.
1 change: 1 addition & 0 deletions configs/acoustic/nomidi.yaml
@@ -103,6 +103,7 @@ gen_tgt_spk_id: -1
 num_sanity_val_steps: 1
 lr: 0.0004
 decay_steps: 50000
+gamma: 0.5
 max_tokens: 80000
 max_sentences: 48
 val_check_interval: 2000
8 changes: 6 additions & 2 deletions pipelines/no_midi_preparation.ipynb
@@ -1076,9 +1076,9 @@
"\n",
"These two parameters jointly determine the batch size at training time, the former representing maximum number of frames in one batch and the latter limiting the maximum batch size. Larger batches consumes more GPU memory at training time. This value can be adjusted according to your GPU memory. Remember not to set this value too low because the model may not converge with small batches.\n",
"\n",
"##### `lr` and `decay_steps`\n",
"##### `lr`, `decay_steps`, `gamma`\n",
"\n",
"These two values refer to the learning rate and number of steps everytime the learning rate decays. If you decreased your batch size, you may consider using a smaller learning rate and more decay steps.\n",
"The learning rate starts at `lr`, decays with the rate `gamma` at every `decay_steps` during training. If you decreased your batch size, you may consider using a smaller learning rate and more decay steps, or larger gamma.\n",
"\n",
"##### `val_check_interval`, `num_ckpt_keep` and `max_updates`\n",
"\n",
@@ -1137,6 +1137,7 @@
"\n",
"lr = 0.0004\n",
"decay_steps = 50000\n",
"gamma = 0.5\n",
"\n",
"val_check_interval = 2000\n",
"num_ckpt_keep = 5\n",
@@ -1185,6 +1186,7 @@
" 'max_sentences': max_sentences,\n",
" 'lr': lr,\n",
" 'decay_steps': decay_steps,\n",
" 'gamma': gamma,\n",
" 'val_check_interval': val_check_interval,\n",
" 'num_valid_plots': min(10, len(test_prefixes)),\n",
" 'num_ckpt_keep': num_ckpt_keep,\n",
@@ -1411,6 +1413,7 @@
"\n",
"lr = 0.0004\n",
"decay_steps = 50000\n",
"gamma = 0.5\n",
"\n",
"val_check_interval = 2000\n",
"num_ckpt_keep = 5\n",
@@ -1485,6 +1488,7 @@
" 'max_sentences': max_sentences,\n",
" 'lr': lr,\n",
" 'decay_steps': decay_steps,\n",
" 'gamma': gamma\n",
" 'val_check_interval': val_check_interval,\n",
" 'num_valid_plots': min(20, len(test_prefixes)),\n",
" 'num_ckpt_keep': num_ckpt_keep,\n",
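Side note on the `max_tokens`/`max_sentences` paragraph in the notebook above: the two limits are typically enforced by greedy packing, where a batch grows until adding the next utterance would exceed either cap. A minimal sketch under that assumption (`pack_batches` and the frame lengths are illustrative, not this repository's actual batch sampler):

```python
def pack_batches(frame_lengths, max_tokens=80000, max_sentences=48):
    """Greedily pack utterances into batches, capping the total frame
    count (max_tokens) and the batch size (max_sentences)."""
    batches, batch, frames = [], [], 0
    for n in frame_lengths:
        # Flush the current batch if adding this utterance would break a limit.
        if batch and (frames + n > max_tokens or len(batch) == max_sentences):
            batches.append(batch)
            batch, frames = [], 0
        batch.append(n)
        frames += n
    if batch:
        batches.append(batch)
    return batches

print(pack_batches([30000, 30000, 30000, 50000]))
# [[30000, 30000], [30000, 50000]]
```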
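The `lr`/`decay_steps`/`gamma` description above matches PyTorch's `StepLR` semantics used in `src/task.py` below: the learning rate at step `t` is `lr * gamma ** (t // decay_steps)`. A quick sanity check with the default values from `nomidi.yaml`:

```python
lr, decay_steps, gamma = 0.0004, 50000, 0.5  # defaults from nomidi.yaml

for step in (0, 49999, 50000, 100000, 150000):
    print(step, lr * gamma ** (step // decay_steps))
# 0      0.0004
# 49999  0.0004   (decay happens only at multiples of decay_steps)
# 50000  0.0002
# 100000 0.0001
# 150000 5e-05
```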
2 changes: 1 addition & 1 deletion src/task.py
@@ -83,7 +83,7 @@ def validation_step(self, sample, batch_idx):
         return outputs
 
     def build_scheduler(self, optimizer):
-        return torch.optim.lr_scheduler.StepLR(optimizer, hparams['decay_steps'], gamma=0.5)
+        return torch.optim.lr_scheduler.StepLR(optimizer, hparams['decay_steps'], gamma=hparams.get('gamma', 0.5))
 
     def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx):
         if optimizer is None:
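For context, `hparams.get('gamma', 0.5)` makes the new key optional, so configs written before this commit keep the previously hard-coded decay rate of 0.5. A minimal illustration with hypothetical config dicts:

```python
old_config = {'decay_steps': 50000}                # pre-commit config, no 'gamma'
new_config = {'decay_steps': 50000, 'gamma': 0.3}  # config with a custom decay rate

print(old_config.get('gamma', 0.5))  # 0.5 -> unchanged behavior
print(new_config.get('gamma', 0.5))  # 0.3 -> user-specified gamma
```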
