
Refactor G2P module and related process #5

Merged
merged 3 commits into open-mmlab:main on Dec 3, 2023

Conversation

@lmxue (Collaborator) commented Dec 3, 2023

Refactor code related to g2p, including:

  • modules and utils related to g2p
  • config settings
  • dataset
  • model input
  • model inference
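To make the scope of the refactor concrete, here is a hedged sketch of what a lexicon-based g2p (grapheme-to-phoneme) lookup might look like. The lexicon entries, function names, and `<unk>` fallback below are illustrative assumptions, not Amphion's actual module API:

```python
# Hypothetical lexicon-based g2p sketch; the lexicon contents and
# function names are illustrative, not Amphion's real interface.
TOY_LEXICON = {
    "hello": ["HH", "AH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
}

def word_to_phones(word, lexicon=TOY_LEXICON, oov_symbol="<unk>"):
    """Map a single word to its phone sequence via a lexicon lookup.

    Out-of-vocabulary words fall back to a single <unk> symbol.
    """
    return lexicon.get(word.lower(), [oov_symbol])

def text_to_phones(text):
    """Split text on whitespace and concatenate per-word phone sequences."""
    phones = []
    for word in text.split():
        phones.extend(word_to_phones(word))
    return phones
```

With this sketch, `text_to_phones("hello world")` yields the concatenated phone sequence for both words, and an unseen word maps to `["<unk>"]`.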

@zhizhengwu (Collaborator) left a comment

This refactor impacts many modules. Has it been tested? We need to make sure training and inference still work and that performance is not degraded.

return packed_batch_features


#===================== Deprecated Code ============================

If this code is deprecated, we can delete it.

"batch_size": self.cfg.train.batch_size,
}
return state_dict
# def _get_state_dict(self):

Not used? If so, delete the commented-out lines.

ckpt_path = os.path.join(path, "epoch-{:04d}.pt".format(self.epoch))
state_dict = self._get_state_dict()
torch.save(state_dict, ckpt_path)
# ckpt_path = os.path.join(path, "epoch-{:04d}.pt".format(self.epoch))
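The epoch-numbered checkpoint path used in the snippet above can be sketched as a small helper. The `torch.save` call is omitted so the example stays dependency-free; the directory and epoch values are illustrative:

```python
import os

def checkpoint_path(directory, epoch):
    """Build an epoch-numbered checkpoint filename, zero-padded to four
    digits, matching the "epoch-{:04d}.pt" pattern in the snippet above."""
    return os.path.join(directory, "epoch-{:04d}.pt".format(epoch))
```

For example, `checkpoint_path("ckpts", 7)` ends with `epoch-0007.pt`; zero-padding keeps checkpoints lexicographically sorted by epoch.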

delete commented lines

ckpt_path = os.path.join(path, "epoch-{:04d}.pt".format(self.epoch))
state_dict = self._get_state_dict()
torch.save(state_dict, ckpt_path)
# ckpt_path = os.path.join(path, "epoch-{:04d}.pt".format(self.epoch))

delete commented lines

)
unique_tokens = SymbolTable.from_file(text_token_path)
text_tokenizer = TextToken(unique_tokens.symbols, add_bos=True, add_eos=True)
# ### get text tokenizer
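The `add_bos`/`add_eos` behavior suggested by this snippet can be sketched as follows. The class below is a hypothetical stand-in for `SymbolTable`/`TextToken`, with assumed id assignments (0 for BOS, 1 for EOS), not their real implementations:

```python
class ToyTextToken:
    """Minimal stand-in for a TextToken-style tokenizer: maps symbols to ids
    and optionally wraps the sequence in BOS/EOS markers."""

    def __init__(self, symbols, add_bos=True, add_eos=True):
        # Reserve ids 0/1 for BOS/EOS; a real symbol table may order differently.
        self.bos_id, self.eos_id = 0, 1
        self.symbol2id = {s: i + 2 for i, s in enumerate(symbols)}
        self.add_bos, self.add_eos = add_bos, add_eos

    def get_token_id_seq(self, symbols):
        """Return (token id sequence, its length), with optional BOS/EOS."""
        ids = [self.symbol2id[s] for s in symbols]
        if self.add_bos:
            ids = [self.bos_id] + ids
        if self.add_eos:
            ids = ids + [self.eos_id]
        return ids, len(ids)
```

Returning the length alongside the ids mirrors the `(phone_id_seq, phn_len)` pairs used elsewhere in this PR's dataset code.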

delete commented lines

prompt_phone_id_seq, prompt_phn_len = text_tokenizer.get_token_id_seq(utt_info["Prompt_phone"])
self.utt2seq[utt] = phone_id_seq
self.utt2pmtseq[utt] = prompt_phone_id_seq
# if cfg.preprocess.phone_extractor == 'lexicon':

Too many commented lines; we need to delete them!

def _build_model(self):

# if self.cfg.preprocess.phone_extractor == 'lexicon':

delete commented lines

@lmxue (Collaborator, Author) commented Dec 3, 2023

This refactor impacts many modules. Has it been tested? We need to make sure training and inference still work and that performance is not degraded.

  • Of course, I have tested the refactored code on VITS and VALL-E; both work well.
  • Also, the refactored code relates only to TTS and aims to disentangle TTS from SVC and TTA, so it should not directly impact SVC or TTA.
  • The deprecated and unused code has been deleted.

@RMSnow (Collaborator) left a comment

approved

@RMSnow RMSnow merged commit 3e829f0 into open-mmlab:main Dec 3, 2023