Conversation

@sliorde
Contributor

@sliorde sliorde commented Jun 22, 2022

The tutorial "Language Modeling with nn.Transformer and TorchText" contains
code snippets with variables named `batch_size`. The issue is that in some
places `batch_size` means the number of sequences in a batch, while in other
places it means the number of tokens in each sequence of the batch. This
inconsistency is resolved in this commit: `batch_size` is renamed to `seq_len`
in the two places where it has the latter meaning.
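To make the two meanings concrete, here is a dependency-free sketch of the tutorial's batching pattern (an illustration paraphrasing the tutorial, not its exact code): `batch_size` counts sequences per batch, while the value formerly also called `batch_size` inside `get_batch` counts tokens per sequence, hence the rename to `seq_len`.

```python
# Illustration (not the tutorial's exact code) of the two meanings
# the PR disentangles: batch_size = number of sequences per batch,
# seq_len = number of tokens taken from each sequence.

def batchify(tokens, batch_size):
    """Trim the token stream and arrange it column-wise so that
    row t holds token t of each of the batch_size sequences."""
    n = len(tokens) // batch_size
    tokens = tokens[: n * batch_size]
    # Column c is sequence c; the result has n rows of batch_size tokens.
    return [[tokens[c * n + t] for c in range(batch_size)] for t in range(n)]

def get_batch(source, i, bptt=35):
    """Slice up to bptt rows starting at row i. The row count is the
    sequence length of this batch, hence seq_len (not batch_size)."""
    seq_len = min(bptt, len(source) - 1 - i)
    data = source[i : i + seq_len]
    # Targets are the same tokens shifted by one position, flattened.
    target = [tok for row in source[i + 1 : i + 1 + seq_len] for tok in row]
    return data, target

rows = batchify(list(range(12)), batch_size=3)   # 4 rows of 3 tokens
data, target = get_batch(rows, i=0, bptt=2)       # seq_len = 2 here
```

With 12 tokens and `batch_size=3`, each of the 3 sequences is 4 tokens long; `get_batch` then yields `seq_len` rows of `batch_size` tokens each, which is exactly why reusing the name `batch_size` for that row count was confusing.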

@netlify

netlify bot commented Jun 22, 2022

Deploy Preview for pytorch-tutorials-preview ready!

Name Link
🔨 Latest commit eeff8d4
🔍 Latest deploy log https://app.netlify.com/sites/pytorch-tutorials-preview/deploys/62ea9a932786f80009fc9300
😎 Deploy Preview https://deploy-preview-1953--pytorch-tutorials-preview.netlify.app

@svekars svekars merged commit d5f7a40 into pytorch:master Aug 3, 2022
