
Commit

Fix in Reformer Config documentation (#5138)
erickrf authored Jun 19, 2020
1 parent 84be482 commit e33929e
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/configuration_reformer.py
@@ -97,7 +97,7 @@ class ReformerConfig(PretrainedConfig):
             Number of following neighbouring chunks to attend to in LocalSelfAttention layer in addition to itself.
         local_attention_probs_dropout_prob (:obj:`float`, optional, defaults to 0.1):
             The dropout ratio for the attention probabilities in LocalSelfAttention.
-        lsh_chunk_length (:obj:`int`, optional, defaults to 64):
+        lsh_attn_chunk_length (:obj:`int`, optional, defaults to 64):
             Length of chunk which attends to itself in LSHSelfAttention. Chunking reduces memory complexity from sequence length x sequence length (self attention) to chunk length x chunk length x sequence length / chunk length (chunked self attention).
         lsh_num_chunks_before (:obj:`int`, optional, defaults to 1):
             Number of previous neighbouring chunks to attend to in LSHSelfAttention layer to itself.
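For context, a minimal sketch (not part of the commit) of how the corrected parameter name is used when building a Reformer configuration. The parameter names and defaults are taken from the docstring shown in the diff above; everything else here is illustrative only.

from transformers import ReformerConfig

# The documented keyword is lsh_attn_chunk_length, not lsh_chunk_length,
# which is exactly what this commit corrects in the docstring.
config = ReformerConfig(
    lsh_attn_chunk_length=64,             # chunk length attending to itself in LSHSelfAttention
    lsh_num_chunks_before=1,              # previous neighbouring chunks attended to
    local_attention_probs_dropout_prob=0.1,
)

# Chunking reduces self-attention memory from (sequence length)^2 to roughly
# chunk_length^2 * (sequence_length / chunk_length), as the docstring explains.
print(config.lsh_attn_chunk_length)  # 64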
