Commit dd926f7: fix some typos and add final mark
Simone-Alghisi committed Sep 11, 2022 (1 parent: 1c96c58)
<!-- omit in toc -->
# TOC
- [Language Modelling](#language-modelling)
- [Project description](#project-description)
- [Results](#results)
- [Final Mark](#final-mark)

# Language Modelling
Repository containing the results of the Natural Language Understanding course project about Language Modelling.

## Project description
The proposed task of Language Modelling (LM) for the NLU course required me to:

1. implement a Language Model using one of the RNN architectures (e.g. Vanilla, LSTM, GRU);
2. train it and evaluate its performance on the word-level Penn Treebank (PTB) dataset;
3. reach a baseline value of 140 PP using a Vanilla RNN, or 90.7 PP using an LSTM.
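As a reminder of the metric used for the baselines above, perplexity (PP) is the exponential of the average per-token cross-entropy (negative log-likelihood). A minimal sketch (the function name is illustrative, not from the project code):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp of the mean per-token negative log-likelihood (in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# A mean cross-entropy of log(140) nats corresponds exactly to 140 PP,
# i.e. the Vanilla RNN baseline target.
print(perplexity([math.log(140)]))
```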

## Results
As a starting point, I decided to implement a very basic model made of:
- a neural embedding layer;
- an LSTM, to capture context information;
- a fully connected layer, for the final word prediction.

This baseline obtained 137 PP.
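The three layers listed above can be sketched in PyTorch as follows. This is a hypothetical reconstruction, not the project's actual code: the class name and layer sizes (`vocab_size`, `emb_dim`, `hidden_dim`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BaselineLM(nn.Module):
    """Baseline LM sketch: embedding -> LSTM -> fully connected layer.
    Dimensions are placeholders, not the ones used in the project."""
    def __init__(self, vocab_size=10000, emb_dim=300, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, hidden=None):
        emb = self.embedding(token_ids)       # (batch, seq, emb_dim)
        out, hidden = self.lstm(emb, hidden)  # (batch, seq, hidden_dim)
        return self.fc(out), hidden           # logits: (batch, seq, vocab_size)

model = BaselineLM()
logits, _ = model(torch.randint(0, 10000, (2, 35)))
print(logits.shape)  # torch.Size([2, 35, 10000])
```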

To improve these results, I considered the techniques described by Merity et al., reaching 81.43 PP.
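Among the techniques described by Merity et al. is weight tying, i.e. sharing the embedding matrix with the output projection. A minimal sketch of the idea (dimensions are illustrative; tying requires the embedding and decoder sizes to match):

```python
import torch.nn as nn

vocab_size, emb_dim = 10000, 400
embedding = nn.Embedding(vocab_size, emb_dim)     # weight: (vocab_size, emb_dim)
decoder = nn.Linear(emb_dim, vocab_size)          # weight: (vocab_size, emb_dim)
decoder.weight = embedding.weight                 # tie input and output weights
```

With tied weights, the input embedding and the output softmax layer share one parameter matrix, reducing model size and typically improving perplexity.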

![](./report/assets/run_results_best.png)

## Final Mark
The Examination Board gave me a full mark for my project (**30 Cum Laude**).