From e7f580e33951f72059409c8681a6c60da79cbfdc Mon Sep 17 00:00:00 2001
From: Yusuke Mori
Date: Mon, 26 Oct 2020 23:49:08 +0900
Subject: [PATCH] Minor error fix of 'bart-large-cnn' details in the pretrained_models doc

---
 docs/source/pretrained_models.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/pretrained_models.rst b/docs/source/pretrained_models.rst
index 517ca7cd2d79fa..297eb2b18185c0 100644
--- a/docs/source/pretrained_models.rst
+++ b/docs/source/pretrained_models.rst
@@ -329,7 +329,7 @@ For a list that includes community-uploaded models, refer to `https://huggingfac
 | | ``facebook/bart-large-mnli`` | | Adds a 2 layer classification head with 1 million parameters |
 | | | | bart-large base architecture with a classification head, finetuned on MNLI |
 | +------------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------+
-| | ``facebook/bart-large-cnn`` | | 12-layer, 1024-hidden, 16-heads, 406M parameters (same as base) |
+| | ``facebook/bart-large-cnn`` | | 24-layer, 1024-hidden, 16-heads, 406M parameters (same as large) |
 | | | | bart-large base architecture finetuned on cnn summarization task |
 +--------------------+------------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------+
 | DialoGPT | ``DialoGPT-small`` | | 12-layer, 768-hidden, 12-heads, 124M parameters |
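
For anyone who wants to double-check the corrected figures, here is a minimal sketch using the ``transformers`` Auto classes (it assumes ``transformers`` and ``torch`` are installed and that the checkpoint can be downloaded from the Hugging Face hub; the expected values in the comments are the ones stated in the table being patched):

```python
from transformers import AutoConfig, AutoModel

# Inspect the configuration of the fine-tuned summarization checkpoint.
config = AutoConfig.from_pretrained("facebook/bart-large-cnn")

# BART configs count encoder and decoder layers separately: 12 + 12 = 24 in
# total, i.e. the bart-large depth rather than the 12-layer bart-base depth.
print(config.encoder_layers + config.decoder_layers)  # expected: 24
print(config.d_model)                                  # expected: 1024 (hidden size)
print(config.encoder_attention_heads)                  # expected: 16

# Counting parameters requires instantiating the weights (downloads the full checkpoint).
model = AutoModel.from_pretrained("facebook/bart-large-cnn")
print(f"{sum(p.numel() for p in model.parameters()):,}")  # roughly 406M
```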