From b523cd7931baef1ef5b7de847bb076b26b70106b Mon Sep 17 00:00:00 2001
From: Abhishek Lomsh
Date: Mon, 15 Jul 2024 17:53:09 +0530
Subject: [PATCH] Update: Fixed typo transformer-embeddings.md

Fixed typo in ~/docs/tutorial/tutorial-embeddings/transformer-embeddings.md
---
 docs/tutorial/tutorial-embeddings/transformer-embeddings.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorial/tutorial-embeddings/transformer-embeddings.md b/docs/tutorial/tutorial-embeddings/transformer-embeddings.md
index eae5eb35bf..f7c2631145 100644
--- a/docs/tutorial/tutorial-embeddings/transformer-embeddings.md
+++ b/docs/tutorial/tutorial-embeddings/transformer-embeddings.md
@@ -111,7 +111,7 @@ torch.Size([1536])
 torch.Size([9984])
 ```
 
-I.e. the size of the embedding increases the mode layers we use (but ONLY if layer_mean is set to False, otherwise the length is always the same).
+I.e. the size of the embedding increases the more layers we use (but ONLY if layer_mean is set to False, otherwise the length is always the same).
 
 (pooling)=
 ### Pooling operation
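
For context, the corrected sentence describes how Flair's `TransformerWordEmbeddings` sizes its output: with `layer_mean=False` the selected layers are concatenated per token, so the embedding length grows with the number of layers, while `layer_mean=True` averages them and keeps the length fixed. Below is a minimal sketch of that behavior; the `bert-base-uncased` model choice and the expected sizes (2 × 768 = 1536, 13 × 768 = 9984) are assumptions inferred from the `torch.Size` outputs shown in the patched tutorial.

```python
from flair.data import Sentence
from flair.embeddings import TransformerWordEmbeddings

# Concatenate the last two transformer layers per token (no averaging).
# For a BERT-base model this is expected to give 2 * 768 = 1536 dimensions.
concat_two = TransformerWordEmbeddings("bert-base-uncased", layers="-1,-2", layer_mean=False)
sentence_a = Sentence("The grass is green.")
concat_two.embed(sentence_a)
print(sentence_a[0].embedding.shape)  # expected: torch.Size([1536])

# Select all layers but average them instead: the length stays at 768
# no matter how many layers are used ("the length is always the same").
averaged_all = TransformerWordEmbeddings("bert-base-uncased", layers="all", layer_mean=True)
sentence_b = Sentence("The grass is green.")
averaged_all.embed(sentence_b)
print(sentence_b[0].embedding.shape)  # expected: torch.Size([768])
```

With `layers="all"` and `layer_mean=False`, the same setup would concatenate the embedding layer plus all 12 transformer layers of BERT-base, which matches the `torch.Size([9984])` line in the diff context.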