Materials for the University of Turku course TKO_8965 Deep Learning in Human Language Technology (previously named TKO_2101 Natural Language Processing).
Bag-of-words text classification with neural networks. In the lectures, we work our way through basic neural network models, their training, and their application to classification.
- Bag-of-words text classification - notebook
- Classifier word vector analysis - notebook
- Word embeddings - notebook
- BoW classifier with pretrained word embeddings - notebook
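The notebooks above build a bag-of-words classifier step by step. As a rough orientation, the sketch below shows the core idea in PyTorch: texts become count vectors over a vocabulary, and a single linear layer is trained on top of them. The toy data, hyperparameters, and framework choice are illustrative assumptions, not the course notebook code.

```python
# Minimal bag-of-words classifier sketch (illustrative; not the course notebook code).
import torch
import torch.nn as nn

# Toy data: (text, label) pairs; labels 0 = negative, 1 = positive.
train_data = [
    ("this movie was great", 1),
    ("a wonderful film", 1),
    ("terrible acting and boring plot", 0),
    ("i hated this movie", 0),
]

# Build a vocabulary mapping each word to an index.
vocab = {w: i for i, w in enumerate(sorted({w for t, _ in train_data for w in t.split()}))}

def bow_vector(text):
    """Encode a text as a bag-of-words count vector over the vocabulary."""
    vec = torch.zeros(len(vocab))
    for w in text.split():
        if w in vocab:
            vec[vocab[w]] += 1.0
    return vec

X = torch.stack([bow_vector(t) for t, _ in train_data])
y = torch.tensor([label for _, label in train_data])

# A single linear layer on top of the BoW vector is already a usable classifier.
model = nn.Linear(len(vocab), 2)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(model(bow_vector("a great film")).argmax().item())  # likely 1 (positive) after training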
Convolutional neural networks and their use in natural language processing.
- Convolutional neural networks - slides
- Sequence to Label with CNNs - notebook
- CNN filter interpretation - notebook
- PyTorch CNN model - notebook
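As background for these materials, the sketch below shows the usual shape of a CNN text classifier: an embedding layer, a 1D convolution over the token dimension, and global max pooling before the output layer. Dimensions and vocabulary size are illustrative assumptions.

```python
# Minimal 1D-CNN text classifier sketch (illustrative dimensions, not the course notebook code).
import torch
import torch.nn as nn

class CNNTextClassifier(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=50, num_filters=64, kernel_size=3, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Conv1d slides filters over the token dimension, detecting local n-gram patterns.
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        emb = self.embedding(token_ids)    # (batch, seq_len, emb_dim)
        emb = emb.transpose(1, 2)          # Conv1d expects (batch, channels, seq_len)
        features = torch.relu(self.conv(emb))
        # Global max pooling over the sequence: keep the strongest activation of each filter.
        pooled, _ = features.max(dim=2)
        return self.classifier(pooled)

model = CNNTextClassifier()
dummy_batch = torch.randint(0, 1000, (4, 20))  # 4 sequences of 20 token ids
print(model(dummy_batch).shape)                # torch.Size([4, 2])
```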
Introduction to recurrent neural networks and applications to various NLP tasks.
- Recurrent neural networks - slides
- Long short-term memory - slides
- Text classification with recurrent neural networks - notebook
- Text generation with recurrent neural networks - notebook
- Named entity recognition with recurrent neural networks - notebook
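To connect the slides and notebooks above, the sketch below is a bidirectional LSTM sequence labeler of the kind used for named entity recognition: one tag prediction per token. Vocabulary size, tag count, and dimensions are illustrative assumptions.

```python
# Minimal bidirectional LSTM sequence labeler sketch (e.g. for NER tags);
# vocabulary size, tag count, and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=50, hidden_dim=64, num_tags=5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # The LSTM reads the sequence in both directions, keeping a hidden state per token.
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # One tag prediction per token, hence a linear layer over every LSTM output.
        self.tag_projection = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):              # (batch, seq_len)
        emb = self.embedding(token_ids)        # (batch, seq_len, emb_dim)
        outputs, _ = self.lstm(emb)            # (batch, seq_len, 2 * hidden_dim)
        return self.tag_projection(outputs)    # (batch, seq_len, num_tags)

model = BiLSTMTagger()
dummy_batch = torch.randint(0, 1000, (4, 20))
print(model(dummy_batch).shape)   # torch.Size([4, 20, 5]): one tag distribution per token
```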
Encoder-decoder and sequence-to-sequence architectures, and an introduction to neural attention.
- Sequence to sequence and neural attention - slides
- Sequence to sequence date normalization - notebook
- Sequence to sequence English to katakana translation - notebook
- Neural machine translation with attention - TensorFlow tutorial
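As a compact illustration of the architecture covered above, the sketch below pairs a GRU encoder and decoder with dot-product attention: each decoder step scores the encoder states and mixes them into a context vector before predicting the next symbol. All dimensions and vocabulary sizes are illustrative assumptions, not the course notebook code.

```python
# Minimal encoder-decoder with dot-product attention sketch
# (illustrative dimensions; not the course notebook code).
import torch
import torch.nn as nn

class Seq2SeqWithAttention(nn.Module):
    def __init__(self, src_vocab=50, tgt_vocab=50, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Decoder state and attention context are combined before predicting the next symbol.
        self.output = nn.Linear(2 * hidden_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        enc_out, enc_state = self.encoder(self.src_emb(src_ids))      # enc_out: (B, S, H)
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), enc_state)   # dec_out: (B, T, H)
        # Dot-product attention: each decoder step scores every encoder position...
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))          # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        # ...and builds a context vector as a weighted sum of encoder states.
        context = torch.bmm(weights, enc_out)                         # (B, T, H)
        return self.output(torch.cat([dec_out, context], dim=-1))     # (B, T, tgt_vocab)

model = Seq2SeqWithAttention()
src = torch.randint(0, 50, (2, 10))   # source sequences
tgt = torch.randint(0, 50, (2, 8))    # target sequences (teacher forcing during training)
print(model(src, tgt).shape)          # torch.Size([2, 8, 50])
```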
Self-attention, transformer model, and deep transfer learning.
- Transformer and transfer learning - slides
- Deep neural language models - slides
- Text classification with BERT - notebook
- Sequence labeling with BERT - notebook
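As a minimal companion to the BERT notebooks above, the sketch below loads a pretrained BERT model with a classification head via the Hugging Face `transformers` library. It assumes `transformers` and `torch` are installed and that pretrained weights can be downloaded; the model name and example sentences are illustrative, and the classification head is randomly initialized until it is fine-tuned on labeled data.

```python
# Minimal BERT text classification sketch using the Hugging Face transformers library
# (assumes `transformers` and `torch` are installed; model name and inputs are illustrative).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# A fresh classification head is placed on top of pretrained BERT;
# in practice it is fine-tuned on labeled data before the predictions are meaningful.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

batch = tokenizer(
    ["A delightful film.", "A complete waste of time."],
    padding=True,
    truncation=True,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**batch).logits     # (batch_size, num_labels)
print(logits.softmax(dim=-1))          # class probabilities (head untrained, so arbitrary)
```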
NLP applications of neural networks and evaluation of NN models.
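For the evaluation part, the sketch below shows the standard way of scoring a classifier's predictions against gold labels with accuracy, precision, recall, and F1. It assumes scikit-learn is installed; the label lists are illustrative.

```python
# Minimal evaluation sketch: comparing predictions against gold labels
# (assumes scikit-learn is installed; the label lists are illustrative).
from sklearn.metrics import accuracy_score, classification_report

gold = ["pos", "pos", "neg", "neg", "neg", "pos"]
pred = ["pos", "neg", "neg", "neg", "pos", "pos"]

# Accuracy is the fraction of correct predictions; precision, recall, and F1
# are reported per class and summarize different kinds of errors.
print("accuracy:", accuracy_score(gold, pred))
print(classification_report(gold, pred))
```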