RoBERTa on SuperGLUE's 'Recognizing Textual Entailment' task #4999
Labels: Contributions welcome; easy (tasks that are relatively easy); Models (issues related to the allennlp-models repo)
RTE (Recognizing Textual Entailment) is one of the tasks in the SuperGLUE benchmark. The goal of this issue is to retrace the steps of Facebook's RoBERTa paper (https://arxiv.org/pdf/1907.11692.pdf) and build an AllenNLP config that reads the RTE data and fine-tunes a RoBERTa model on it. We expect scores in the range of the RoBERTa entry on the SuperGLUE leaderboard.
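For reference, here is a minimal sketch of what the dataset reader could look like, assuming the SuperGLUE RTE jsonl layout (premise, hypothesis, and label keys) and following the pattern of AllenNLP's existing transformer-based pair-classification readers. The registered name `superglue_rte` and the constructor argument `transformer_model_name` are made up for this sketch:

```python
# Sketch of an RTE reader: tokenizes premise and hypothesis separately, joins
# them with the transformer's special tokens, and emits a TextField + LabelField.
import json
from typing import Optional

from allennlp.common.file_utils import cached_path
from allennlp.data import DatasetReader, Instance
from allennlp.data.fields import LabelField, TextField
from allennlp.data.token_indexers import PretrainedTransformerIndexer
from allennlp.data.tokenizers import PretrainedTransformerTokenizer


@DatasetReader.register("superglue_rte")  # hypothetical name
class SuperGlueRteReader(DatasetReader):
    def __init__(self, transformer_model_name: str = "roberta-large", **kwargs) -> None:
        super().__init__(**kwargs)
        # Special tokens are added later, when the two sentences are combined.
        self._tokenizer = PretrainedTransformerTokenizer(
            transformer_model_name, add_special_tokens=False
        )
        self._token_indexers = {
            "tokens": PretrainedTransformerIndexer(transformer_model_name)
        }

    def _read(self, file_path: str):
        # SuperGLUE RTE ships as jsonl: one example per line with
        # "premise", "hypothesis", and (except for the test split) "label".
        with open(cached_path(file_path)) as data_file:
            for line in data_file:
                example = json.loads(line)
                yield self.text_to_instance(
                    premise=example["premise"],
                    hypothesis=example["hypothesis"],
                    label=example.get("label"),
                )

    def text_to_instance(
        self, premise: str, hypothesis: str, label: Optional[str] = None
    ) -> Instance:
        premise_tokens = self._tokenizer.tokenize(premise)
        hypothesis_tokens = self._tokenizer.tokenize(hypothesis)
        # Combine the pair with the model's special tokens (<s> ... </s></s> ... </s>).
        tokens = self._tokenizer.add_special_tokens(premise_tokens, hypothesis_tokens)
        fields = {"tokens": TextField(tokens, self._token_indexers)}
        if label is not None:
            fields["label"] = LabelField(label)
        return Instance(fields)
```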
We recommend you use the AllenNLP Repository Template as a starting point. It might also be helpful to look at how the TransformerQA model works (training config, reader, model).
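A training config could then wire the reader together with existing AllenNLP components (`basic_classifier`, the `pretrained_transformer` embedder, `bert_pooler`, `huggingface_adamw`). The sketch below expresses it as a Params dict for brevity; a jsonnet file would mirror the same structure. The `superglue_rte` type is the hypothetical reader above, the data paths are placeholders, and the hyperparameters are illustrative rather than the paper's exact grid:

```python
# Sketch of a training config: fine-tune roberta-large as a sentence-pair
# classifier on RTE using stock AllenNLP components.
from allennlp.commands.train import train_model
from allennlp.common import Params

transformer_model = "roberta-large"

config = {
    "dataset_reader": {
        "type": "superglue_rte",  # the hypothetical reader sketched above
        "transformer_model_name": transformer_model,
    },
    "train_data_path": "/path/to/RTE/train.jsonl",
    "validation_data_path": "/path/to/RTE/val.jsonl",
    "model": {
        "type": "basic_classifier",
        "text_field_embedder": {
            "token_embedders": {
                "tokens": {
                    "type": "pretrained_transformer",
                    "model_name": transformer_model,
                }
            }
        },
        # Classify from the pooled [CLS]/<s> representation.
        "seq2vec_encoder": {
            "type": "bert_pooler",
            "pretrained_model": transformer_model,
        },
    },
    "data_loader": {"batch_size": 16, "shuffle": True},
    "trainer": {
        "num_epochs": 10,
        "validation_metric": "+accuracy",
        "optimizer": {
            "type": "huggingface_adamw",
            "lr": 1e-5,
            "weight_decay": 0.1,
        },
    },
}

train_model(Params(config), serialization_dir="/tmp/rte_roberta")
```

Note that for RTE the RoBERTa paper reports fine-tuning from an MNLI checkpoint rather than from the base pretrained weights, which is likely needed to reach a leaderboard-level score.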