Solutions for CS224n, Winter 2019.
Feel free to discuss problems from the assignments by opening an issue.
Notes on the key points of each lecture are also included.
Written solutions for the assignments are provided in Markdown, in the written part of Assignments.
- Course page: https://web.stanford.edu/class/cs224n
- Video page: https://www.youtube.com/watch?v=8rXD5-xhemo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z
After CS224n I realized that more systematic training was needed, so I started a new repo, learn_NLP_again. Here is its description (algorithms and solutions are available for chapter 1 so far):
Why I started this project: to learn NLP from scratch again. I chose Speech and Language Processing as my entry point, and I am writing solutions and implementing some of the algorithms/models from the book. I hope I can stick with this project and update it frequently.
After a year of training in industry and the lab, I have found many faults and bad habits in my past practice (btw, there are too many commits in this repo). I'll review the code in this repo and fix issues gradually (:smile:, hopefully).
Discussion is welcome in the new repo!
- note: Word Vectors I: Introduction, SVD and Word2Vec
- Word2Vec Tutorial - The Skip-Gram Model
- coding: Assignment1
- Gensim
- note: Word Vectors II: GloVe, Evaluation and Training
- gradient-notes
- CS231n notes on backprop
- review-differential-calculus
- backprop_old
- CS231n notes on network architectures
- coding: Assignment2
- writing: Assignment2
- note: Dependency Parsing
- note: Language Models and Recurrent Neural Networks
- a3
- coding: Assignment3
- writing: Assignment3
- note: Machine Translation, Sequence-to-sequence and Attention
- a4
- read: Attention and Augmented Recurrent Neural Networks
- read: Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)
- coding: Assignment4
- writing: Assignment4
How to understand pack_padded_sequence and pad_packed_sequence?
(Chinese ed)
(English ed)
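A minimal runnable sketch of the packing question above (assuming PyTorch; the tensor values are made up for illustration). `pack_padded_sequence` drops the padding and stores the real timesteps in time-major order, and `pad_packed_sequence` inverts the operation:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# A batch of 3 sequences padded to max length 4 (batch_first=True),
# with a trailing feature dimension of size 1. Zeros are padding.
seqs = torch.tensor([[1, 2, 3, 4],
                     [5, 6, 0, 0],
                     [7, 0, 0, 0]], dtype=torch.float).unsqueeze(-1)
lengths = [4, 2, 1]  # true lengths, sorted descending (enforce_sorted=True default)

packed = pack_padded_sequence(seqs, lengths, batch_first=True)
# packed.data holds only the real timesteps, concatenated time-major:
# t=0 -> [1, 5, 7], t=1 -> [2, 6], t=2 -> [3], t=3 -> [4]
print(packed.data.squeeze(-1))  # tensor([1., 5., 7., 2., 6., 3., 4.])
print(packed.batch_sizes)       # tensor([3, 2, 1, 1])

# pad_packed_sequence restores the padded (batch, time, feature) layout
unpacked, out_lengths = pad_packed_sequence(packed, batch_first=True)
print(torch.equal(unpacked, seqs))  # True
```

Feeding the `PackedSequence` to an `nn.LSTM`/`nn.GRU` lets the RNN skip the padded timesteps entirely, which is why Assignment 4's encoder packs before and unpacks after the recurrent layer.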
It has been a long time since the last update...
- coding: Assignment5
- writing: Assignment5
reading:
- final-project-practical-tips
- default-final-project-handout
- project-proposal-instructions
- Practical Methodology_Deep Learning book chapter
- Highway Networks
- Bidirectional Attention Flow for Machine Comprehension
practice:
- annotate code
- train baseline