KDD19 Tutorial: From Shallow to Deep Language Representations: Pre-training, Fine-tuning, and Beyond
Presenters: Aston Zhang, Haibin Lin, Leonard Lausen, Sheng Zha, and Alex Smola
Other contributors: Chenguang Wang and Mu Li
Natural language processing (NLP) is at the core of the pursuit of artificial intelligence, with deep learning as the main powerhouse of recent advances, yet most NLP problems remain unsolved. The compositional nature of language enables us to express complex ideas, but at the same time makes it intractable to spoon-feed enough labels to data-hungry algorithms for all situations. Recent progress on unsupervised language representation techniques brings new hope. In this hands-on tutorial, we walk through these techniques and show how NLP learning can be drastically improved by pre-training and fine-tuning language representations on unlabelled text. Specifically, we consider shallow representations from word embedding models such as word2vec, fastText, and GloVe, and deep representations with attention mechanisms such as BERT. We demonstrate detailed procedures and best practices for pre-training such models and fine-tuning them on downstream NLP tasks as diverse as finding synonyms and analogies, sentiment analysis, question answering, and machine translation. All the hands-on implementations use Apache MXNet (incubating) and GluonNLP, and part of the implementations are available in Dive into Deep Learning.
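As a taste of the hands-on format, the minimal sketch below uses GluonNLP to load pretrained GloVe vectors and compare two words by cosine similarity (the GluonNLP 0.x API is assumed, and the word pair is an illustrative choice, not the tutorial's exact example):

```python
import mxnet as mx
import gluonnlp as nlp

# Download and load pretrained 50-dimensional GloVe vectors.
glove = nlp.embedding.create('glove', source='glove.6B.50d')

# Look up the embedding vectors for two tokens.
x, y = glove['paris'], glove['france']

# Cosine similarity between the two word vectors.
sim = mx.nd.dot(x, y) / (x.norm() * y.norm())
print(sim.asscalar())
```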
Time | Tutor | Title | Slides | Notebooks
---|---|---|---|---
9:30am-10:00am | Alex Smola | Basics of hands-on deep learning | basics | ndarray, autograd
10:00am-11:00am | Alex Smola | Neural networks | model, cnn-rnn, seq, rnn |
11:00am-11:10am | Coffee break | | |
11:10am-11:30am | Aston Zhang | Shallow language representations in word embeddings | shallow-models |
11:30am-12:30pm | Aston Zhang | Application | sim-analogy-sentiment-analysis-rnn-cnn |
12:30pm-1:00pm | Lunch break | | |
1:00pm-2:20pm | Leonard Lausen | Transformer | transformer | transformer
2:20pm-2:30pm | Coffee break | | |
2:30pm-3:30pm | Haibin Lin | Deep language representations with Transformer (BERT) | bert |
3:30pm-4:00pm | Haibin Lin | Application | sentiment-bert |
- Local installation guide
- Source code of the `d2l` (v0.10.1) package
- This tutorial is based on Dive into Deep Learning and GluonNLP.
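For the afternoon BERT sessions, loading the pre-trained model with GluonNLP looks roughly like the sketch below. This is a minimal sketch assuming the GluonNLP 0.x `get_model` API; the `BERTClassifier` head is a hypothetical illustration of the fine-tuning setup, not the tutorial's exact code:

```python
import gluonnlp as nlp
from mxnet.gluon import nn

# Load a pre-trained 12-layer BERT base model and its vocabulary.
bert, vocab = nlp.model.get_model(
    'bert_12_768_12',
    dataset_name='book_corpus_wiki_en_uncased',
    pretrained=True,
    use_pooler=True,      # keep the pooled [CLS] representation
    use_decoder=False,    # drop the masked-LM decoder head
    use_classifier=False) # drop the next-sentence-prediction head

# Hypothetical fine-tuning head: a dense layer on the pooled output.
class BERTClassifier(nn.Block):
    def __init__(self, bert, num_classes=2, **kwargs):
        super().__init__(**kwargs)
        self.bert = bert
        self.classifier = nn.Dense(num_classes)

    def forward(self, inputs, token_types, valid_length):
        # The BERT block returns (sequence_encoding, pooled_output).
        _, pooled = self.bert(inputs, token_types, valid_length)
        return self.classifier(pooled)
```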