# Word-in-Context disambiguation

Open in Colab

Word-in-Context (WiC) disambiguation framed as a binary classification task: given a target word appearing in two different contexts, decide whether it carries the same meaning in both, using static word embeddings (e.g. Word2Vec and GloVe).
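To make the task concrete, here is a minimal baseline sketch of the idea: represent each context by averaging the static embeddings of its tokens and compare the two contexts with cosine similarity. The toy embedding table, the averaging scheme, and the threshold are illustrative assumptions, not the repository's actual method.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for illustration only;
# in practice these would be pre-trained Word2Vec or GloVe vectors.
EMB = {
    "bank":    np.array([0.9, 0.1, 0.0, 0.2]),
    "river":   np.array([0.8, 0.2, 0.1, 0.0]),
    "money":   np.array([0.1, 0.9, 0.3, 0.1]),
    "deposit": np.array([0.2, 0.8, 0.2, 0.1]),
    "flowed":  np.array([0.7, 0.1, 0.2, 0.1]),
}

def context_vector(tokens):
    """Average the embeddings of the tokens we have vectors for."""
    vecs = [EMB[t] for t in tokens if t in EMB]
    return np.mean(vecs, axis=0)

def same_meaning(ctx1, ctx2, threshold=0.9):
    """Binary WiC decision via cosine similarity of the two context vectors."""
    v1, v2 = context_vector(ctx1), context_vector(ctx2)
    cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return bool(cos >= threshold)
```

With these toy vectors, "bank" next to "river" and "flowed" ends up far from "bank" next to "money" and "deposit", so the baseline labels the pair as different senses.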

## Implementation details

We propose a Bi-LSTM architecture with pre-trained word embeddings and compare it against a simpler feed-forward neural network baseline. For further details, read the dedicated report or the presentation slides (pages 2-6).
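The shape of such a model can be sketched as follows in PyTorch. The hidden size, mean pooling, and final linear classifier are illustrative assumptions; the repository's exact configuration is in the report and slides.

```python
import torch
import torch.nn as nn

class WiCBiLSTM(nn.Module):
    """Sketch of a Bi-LSTM WiC classifier over two contexts.

    Hyperparameters and pooling are hypothetical, chosen only to
    illustrate the architecture described above.
    """
    def __init__(self, embeddings, hidden=128):
        super().__init__()
        # Frozen pre-trained embedding matrix (e.g. GloVe), shape (V, D).
        self.emb = nn.Embedding.from_pretrained(embeddings, freeze=True)
        self.lstm = nn.LSTM(embeddings.size(1), hidden,
                            batch_first=True, bidirectional=True)
        # Two sentence encodings of size 2*hidden each, concatenated.
        self.clf = nn.Linear(4 * hidden, 1)

    def encode(self, ids):
        out, _ = self.lstm(self.emb(ids))   # (B, T, 2*hidden)
        return out.mean(dim=1)              # mean-pool over time steps

    def forward(self, ids1, ids2):
        h = torch.cat([self.encode(ids1), self.encode(ids2)], dim=-1)
        return torch.sigmoid(self.clf(h)).squeeze(-1)  # P(same meaning)
```

The model encodes each context independently with the shared Bi-LSTM, then scores the concatenated pair, which keeps the two contexts symmetric up to the classifier.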

## Get the dataset

You may download the original dataset from here.

## Test the model

For ready-to-go usage, simply run the notebook on Colab. If you would like to test it on your local machine, please follow the installation guide. You can find the pre-trained models in model/pretrained.