
Examples: check max_position_embeddings in the translation example #29600

Merged: 1 commit into huggingface:main on Mar 12, 2024

Conversation

@gante (Member) commented Mar 11, 2024

What does this PR do?

Fixes #28104

When a model with absolute position embeddings (such as Marian) is used in the translation training example, we must confirm that the requested input sequence length does not exceed the model's maximum sequence length. Otherwise, the data is prepared with a longer sequence length than the model supports, resulting in indexing errors.

This PR adds an informative exception that explains how to fix the problem when that happens :)

[Note: models with relative position embeddings, like T5, don't need this check. They can handle arbitrary sequence lengths.]
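
For illustration, here is a minimal sketch of the kind of check described above, not the exact code merged in this PR. The helper name `check_sequence_length` is hypothetical; it assumes the translation example's `--max_source_length` argument and the config's `max_position_embeddings` attribute.

```python
from transformers import AutoConfig

def check_sequence_length(model_name_or_path: str, max_source_length: int) -> None:
    """Raise an informative error when the requested length exceeds the model's limit."""
    config = AutoConfig.from_pretrained(model_name_or_path)
    # Models with relative position embeddings (e.g. T5) do not expose a hard
    # limit, so only configs that define `max_position_embeddings` are checked.
    max_positions = getattr(config, "max_position_embeddings", None)
    if max_positions is not None and max_source_length > max_positions:
        raise ValueError(
            f"`--max_source_length` ({max_source_length}) is larger than the maximum sequence "
            f"length the model can handle ({max_positions}). Please reduce `--max_source_length`."
        )

# Marian checkpoints typically have max_position_embeddings == 512, so asking for
# 1024 tokens would trigger the error above; 512 passes the check.
check_sequence_length("Helsinki-NLP/opus-mt-en-ro", max_source_length=512)
```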

@gante gante requested a review from amyeroberts March 11, 2024 19:53
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@amyeroberts (Collaborator) left a comment

Thanks for digging into this and fixing!

@gante merged commit d479665 into huggingface:main on Mar 12, 2024 (8 checks passed)
@gante deleted the marian_translation_script branch on March 12, 2024 18:58
itazap pushed a commit that referenced this pull request May 14, 2024
Development

Successfully merging this pull request may close these issues.

CUDA Error running the Translation example due to embeddings
3 participants