# Attentions in Tacotron

This repo collects the attention mechanisms used in Tacotron, with the goal of building a better Tacotron. :-)

Currently, the following attention mechanisms are implemented:

  • Bahdanau Attention
  • Location Sensitive Attention (LSA)
  • Stepwise Monotonic Attention (SMA)
  • GMM Attention V0/V1/V2
  • Dynamic Convolution Attention (DCA)
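To illustrate the family these mechanisms belong to, here is a minimal NumPy sketch of the first one, Bahdanau (additive) attention: an alignment energy `v^T tanh(W_q q + W_k k_i)` is computed for each encoder step and normalized with a softmax. The function and parameter names are illustrative, not taken from this repo's code.

```python
import numpy as np

def bahdanau_attention(query, keys, W_q, W_k, v):
    """Additive (Bahdanau) attention sketch.

    query: (d_q,)      decoder state
    keys:  (T, d_k)    encoder outputs
    W_q:   (d_q, d_a)  query projection
    W_k:   (d_k, d_a)  key projection
    v:     (d_a,)      energy vector
    """
    # Energy per encoder step: e_i = v^T tanh(W_q q + W_k k_i), shape (T,)
    energies = np.tanh(query @ W_q + keys @ W_k) @ v
    # Numerically stable softmax over encoder steps
    weights = np.exp(energies - energies.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder outputs, shape (d_k,)
    context = weights @ keys
    return weights, context

# Example usage with random parameters
rng = np.random.default_rng(0)
T, d_q, d_k, d_a = 5, 4, 4, 8
q = rng.standard_normal(d_q)
K = rng.standard_normal((T, d_k))
W_q = rng.standard_normal((d_q, d_a))
W_k = rng.standard_normal((d_k, d_a))
v = rng.standard_normal(d_a)
w, c = bahdanau_attention(q, K, W_q, W_k, v)
```

The other mechanisms in the list modify how the energies are computed (e.g. LSA adds convolutional location features of the previous alignment; GMM and DCA replace the content term with purely location-based mixtures), while keeping the same normalize-and-weight structure.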

## Acknowledgements

This repo draws heavily on: