Add pretrained relation extraction models #2492
Conversation
Thank you very much for adding this. Can you provide some insights or a short description of the architecture of the relation extraction models?
Hello @alejandrojcastaneira, we used a very simple architecture here: the model embeds the two entity spans, concatenates the span embeddings, and passes them through a linear classification layer.
@alanakbik does the embedding include some sort of positional embedding? I am asking because I have a use case where, for example, a sentence contains two persons and two birth_places. Can this relation extraction model be trained to distinguish between these two birth_place relations based on the context of the sentence?
Yes, if you use contextual embeddings.
@alanakbik @djstrong in my question I meant positional context, not semantic context. Is there any paper or blog post that sheds some light on the Flair relation extraction?
Semantic context is usually positional. Flair RE extracts the embeddings of the two spans, concatenates them, and passes the result through a fully connected neural network.
@djstrong is the 'embedding' the dynamic embedding of the span in the current sentence, or the pretrained embedding?
You can choose embeddings (dynamic or static). |
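To illustrate the architecture described in this thread (pool the embeddings of each entity span, concatenate them, feed the result to a fully connected layer), here is a minimal PyTorch sketch. The class name, mean pooling, and single linear layer are illustrative assumptions, not Flair's actual implementation:

```python
import torch
import torch.nn as nn


class SimpleRelationClassifier(nn.Module):
    """Illustrative sketch: span embeddings -> concatenation -> fully connected classifier."""

    def __init__(self, embedding_dim: int, num_relations: int):
        super().__init__()
        # classifier over the concatenated head-span and tail-span embeddings
        self.linear = nn.Linear(2 * embedding_dim, num_relations)

    def forward(self, token_embeddings: torch.Tensor, head_span: slice, tail_span: slice) -> torch.Tensor:
        # token_embeddings: (sequence_length, embedding_dim); with contextual (dynamic)
        # embeddings, identical surface forms get position-dependent vectors, which is
        # what lets the model separate two birth_place relations in the same sentence
        head = token_embeddings[head_span].mean(dim=0)  # pool tokens of the first entity span
        tail = token_embeddings[tail_span].mean(dim=0)  # pool tokens of the second entity span
        pair = torch.cat([head, tail], dim=-1)          # concatenate the two span embeddings
        return self.linear(pair)                        # scores over relation labels
```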
This PR adds a few relation extraction models, trained over a modified version of TACRED. Two models are added: `relations` and `relations-fast`. To use these models, you also need an entity tagger: the tagger identifies entities, then the relation extractor predicts relations between the identified entities. For instance, use this code:
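The code snippet referenced above is not rendered in this view. A minimal sketch of the intended usage, assuming the Flair API of that release (`SequenceTagger` for NER, `RelationExtractor` for relations; the `'ner-fast'` tagger and the `'relation'` label type are assumptions), might look like this:

```python
from flair.data import Sentence
from flair.models import RelationExtractor, SequenceTagger

# example sentence
sentence = Sentence("George was born in Washington")

# load an entity tagger and predict entities first
tagger = SequenceTagger.load('ner-fast')
tagger.predict(sentence)

# load the relation extractor (use 'relations' for the larger model)
extractor = RelationExtractor.load('relations-fast')
extractor.predict(sentence)

# inspect the predicted entities and relations
print(sentence)
for label in sentence.get_labels('relation'):
    print(label)
```

The entity tagger must run before the relation extractor, since the extractor classifies relations only between entity spans that have already been identified.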