This is the codebase for our technical report "Context-aware Decoding Reduces Hallucination in Query-focused Summarization": https://arxiv.org/pdf/2312.14335.pdf
Requirements:

- sklearn
- torchmetrics
- transformers
- datasets
- evaluate
Follow the instructions at https://huggingface.co/docs/transformers/installation to do an editable install of transformers, then replace `transformers/src/transformers/generation/utils.py` with the `generation/utils.py` provided in this repo.
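One possible command sequence for the editable install and file replacement; the checkout location is an assumption, so adjust the paths to your own layout:

```shell
# Editable install of transformers from source, per the HF installation docs.
git clone https://github.com/huggingface/transformers.git
pip install -e ./transformers

# Overwrite the stock generation utilities with the patched version from this repo.
# Assumes you run this from the root of this repo, with transformers cloned alongside.
cp generation/utils.py transformers/src/transformers/generation/utils.py
```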
- Download the required datasets from this google drive link
- Change the path in `./src/utils.py` accordingly
- Run the sample bash scripts in `./src/bash_scripts`
- A detailed list of arguments can be found in `src/test_performance_decoder.py` and `src/test_performance_encoder_decoder.py`
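For reference, the decoding rule behind this repo is the contrastive formulation of context-aware decoding from Shi et al. (2023): the next-token distribution is sharpened toward tokens whose probability increases when the context is present. A minimal NumPy sketch (function and variable names are illustrative, not the repo's API):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def context_aware_next_token_dist(logits_with_context, logits_without_context, alpha=0.5):
    # Context-aware decoding (Shi et al., 2023): contrast next-token logits
    # computed with and without the input context,
    #   softmax[(1 + alpha) * logit(y | c, x) - alpha * logit(y | x)],
    # which upweights tokens supported by the context c.
    contrasted = (1 + alpha) * logits_with_context - alpha * logits_without_context
    return softmax(contrasted)

# Toy vocabulary of 3 tokens; token 2 is strongly supported by the context.
with_ctx = np.array([1.0, 2.0, 4.0])
without_ctx = np.array([1.0, 2.0, 2.5])
p = context_aware_next_token_dist(with_ctx, without_ctx, alpha=0.5)
```

With `alpha=0` this reduces to ordinary decoding from the context-conditioned logits; larger `alpha` penalizes tokens the model would predict even without the context.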
Contact zhichao.xu@utah.edu if you have trouble running the code.
If you find this work useful, please cite:

@article{shi2023trusting,
  title={Trusting Your Evidence: Hallucinate Less with Context-aware Decoding},
  author={Shi, Weijia and Han, Xiaochuang and Lewis, Mike and Tsvetkov, Yulia and Zettlemoyer, Luke and Yih, Scott Wen-tau},
  journal={arXiv preprint arXiv:2305.14739},
  year={2023}
}

@article{xu2023context,
  title={Context-aware Decoding Reduces Hallucination in Query-focused Summarization},
  author={Xu, Zhichao},
  journal={arXiv preprint arXiv:2312.14335},
  year={2023}
}