
transformer_RC

A reading comprehension model built on the Transformer.

  • The architecture of this model combines the Transformer's parallelized attention, a BiDAF-style query-aware passage representation, and a Pointer Network for answer extraction.
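As a rough illustration of the parallelized attention mentioned above (a toy sketch in plain Python, not the repository's actual implementation; the function names and toy dimensions are made up for this example):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    Inputs are plain lists of floats; in a real model these would be
    tensors and all queries would be processed in parallel.
    """
    d = len(query)
    # Similarity of the query to every key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted sum of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]
```

When the query closely matches one key, the output is dominated by that key's value vector, which is the core mechanism both the Transformer attention and the BiDAF query-passage interaction rely on.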

  • Training loss: see the loss curve plot.

  • You may want to inspect the prediction results here.

Final result: ROUGE-L: 0.2651, BLEU-1: 0.36.
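For reference, ROUGE-L scores a candidate answer against a reference by their longest common subsequence. A minimal sketch of the metric in plain Python (the `beta` weighting and function names are illustrative assumptions, not taken from this repository's evaluation code):

```python
def lcs_len(a, b):
    """Longest common subsequence length via dynamic programming."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def rouge_l(candidate, reference, beta=1.2):
    """ROUGE-L F-score between two token lists (toy version)."""
    lcs = lcs_len(candidate, reference)
    if lcs == 0:
        return 0.0
    prec = lcs / len(candidate)   # precision: LCS over candidate length
    rec = lcs / len(reference)    # recall: LCS over reference length
    return (1 + beta ** 2) * prec * rec / (rec + beta ** 2 * prec)
```

An exact match scores 1.0; partial token overlap yields a score between 0 and 1, so the 0.2651 above indicates modest subsequence overlap with the reference answers.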

Results are still being updated; stars and follows are welcome.