This repository contains the source code for our paper ITER, accepted at EMNLP 2024.
To set up the repository, the following basic steps are required:
```shell
python3 -m venv venv && source venv/bin/activate  # optional
pip install git+https://github.com/fleonce/iter
bash scripts/datasets/load_datasets.sh
python3 train.py --transformer t5-small --dataset {ace05,ade,conll03,conll04,genia,scierc}
```
where the `--transformer` and `--dataset` arguments accept the following values.

Currently supported transformer models are:
- t5-{small,base,large,3b,11b}
- google/t5-v1_1-{small,base,large,xl,xxl}
- google/flan-t5-{small,base,large,xl,xxl}
- bert-large-cased
- microsoft/deberta-v3-{xsmall,small,base,large}
- microsoft/deberta-v2-{xlarge,xxlarge}
Currently supported datasets are:
- ace05
- ade
- conll03
- conll04
- genia
- scierc
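For example, to train the smallest T5 model on CoNLL04 (any model/dataset combination from the lists above should work the same way):

```shell
python3 train.py --transformer t5-small --dataset conll04
```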
To evaluate the checkpoints we provide, use the following command:

```shell
python3 evaluate.py --model {checkpoint}
```
We publish checkpoints for the models performing best on the following datasets: