---
title: You Might Speak
emoji: 🌍
colorFrom: pink
colorTo: gray
sdk: gradio
sdk_version: 5.9.1
app_file: app.py
pinned: false
short_description: I guess you might speak <Language>
---
The code is partially inspired by the PyTorch char-RNN classification tutorial: https://pytorch.org/tutorials/intermediate/char_rnn_classification_tutorial.html (last accessed: 30 Dec 2024).
## Changes I Introduced
- `NamesDataset` is separated from the transformation step, so the same transformation can be reused during inference.
- The target is an integer class index instead of a one-hot encoding.
- Changed the loss from the combination of LogSoftmax + NLLLoss to CrossEntropyLoss (exactly the same thing mathematically), which in turn required removing the LogSoftmax layer from the architecture.
- A DataLoader is added.
- The input is made batch-first, and the corresponding RNN is also made batch-first (see the sketch after this list).
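A minimal sketch of how these changes fit together, assuming illustrative names (`CharRNN`, `N_LETTERS`, `N_CLASSES`) rather than the exact ones in `app.py`: the RNN is batch-first, the final layer outputs raw logits (no LogSoftmax), and the loss is CrossEntropyLoss over integer targets.

```python
import torch
import torch.nn as nn

N_LETTERS = 57   # one-hot character vocabulary size (illustrative)
N_HIDDEN = 128
N_CLASSES = 18   # number of languages (illustrative)

class CharRNN(nn.Module):
    def __init__(self):
        super().__init__()
        # batch_first=True: inputs are (batch, seq_len, n_letters)
        self.rnn = nn.RNN(N_LETTERS, N_HIDDEN, batch_first=True)
        # No LogSoftmax here: CrossEntropyLoss expects raw logits
        self.fc = nn.Linear(N_HIDDEN, N_CLASSES)

    def forward(self, x):
        _, hidden = self.rnn(x)      # hidden: (1, batch, N_HIDDEN)
        return self.fc(hidden[0])    # logits: (batch, N_CLASSES)

model = CharRNN()
criterion = nn.CrossEntropyLoss()    # replaces LogSoftmax + NLLLoss

# Dummy batch: targets are integer class indices, not one-hot vectors
x = torch.zeros(4, 10, N_LETTERS)    # (batch, seq_len, n_letters)
y = torch.tensor([0, 3, 7, 1])       # (batch,)
loss = criterion(model(x), y)
loss.backward()
```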
Although the code is mostly replicated from the tutorial, I changed the data loader to apply a lowercase transformation to the data, and this confused the model.

Notice the brighter diagonal in the confusion matrix for the model trained without lowercasing: when the actual class is Arabic (for example), the prediction is mostly Arabic as well. This is not the case with lowercasing.
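For reference, a hedged sketch of where the lowercase transformation would sit, kept outside the dataset so it can also be applied at inference time; the helper name `name_to_tensor` and the alphabet are illustrative, not the exact code used here.

```python
import string
import torch

ALL_LETTERS = string.ascii_letters + " .,;'"   # illustrative alphabet

def name_to_tensor(name: str, lowercase: bool = False) -> torch.Tensor:
    """One-hot encode a name; optionally lowercase it first (the change that hurt accuracy)."""
    if lowercase:
        name = name.lower()
    tensor = torch.zeros(len(name), len(ALL_LETTERS))
    for i, ch in enumerate(name):
        idx = ALL_LETTERS.find(ch)
        if idx >= 0:                  # characters outside the alphabet stay all-zero
            tensor[i, idx] = 1.0
    return tensor
```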