Reading comprehension model with Transformer
-
The architecture of this model combines the Transformer's parallelized self-attention, BiDAF-style query-aware passage representations, and a Pointer Network head for answer extraction.
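A minimal NumPy sketch of the two attention stages named above, assuming dot-product similarity and randomly initialized states; the shapes, dimensions, and weight vectors (`w_start`, `w_end`) are illustrative placeholders, not the repo's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, P, Q = 8, 10, 4                      # hidden size, passage length, query length
passage = rng.standard_normal((P, d))   # passage token states (e.g. Transformer outputs)
query = rng.standard_normal((Q, d))     # query token states

# BiDAF-style attention: similarity matrix, then query-aware passage states
S = passage @ query.T                   # (P, Q) similarity scores
a = softmax(S, axis=1)                  # attention over query tokens per passage token
attended = a @ query                    # (P, d) query-aware passage representation

# Pointer-network head: distributions over passage positions for answer start/end
w_start = rng.standard_normal(d)        # hypothetical learned weight vectors
w_end = rng.standard_normal(d)
p_start = softmax(attended @ w_start)   # (P,) start-position distribution
p_end = softmax(attended @ w_end)       # (P,) end-position distribution
start, end = int(p_start.argmax()), int(p_end.argmax())
print(start, end)
```

In the full model these stages are trained end to end; the pointer head replaces a fixed output vocabulary by pointing directly at passage positions.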
-
You may want to inspect the prediction results here.
Final result: ROUGE-L: 0.2651, BLEU-1: 0.36.
Results are still being updated; stars and follows are welcome.