"@huggingface/transformers": "^3.3.2",
When trying to parse messages using AutoTokenizer.from_pretrained with the model deepseek-ai/DeepSeek-R1-Distill-Llama-8B, I receive the error
AutoTokenizer.from_pretrained
deepseek-ai/DeepSeek-R1-Distill-Llama-8B
Failed to get test result: Error: Parser Error: Expected closing statement token. OpenSquareBracket !== CloseStatement.
Others were seeing similar errors in other DeepSeek models but I'm not sure if that affects this package
https://huggingface.co/mlx-community/deepseek-r1-distill-qwen-1.5b/discussions/1
Reproduction

```js
import { AutoTokenizer } from '@huggingface/transformers';

async function loadTokenizer() {
  const tokenizer = await AutoTokenizer.from_pretrained('deepseek-ai/DeepSeek-R1-Distill-Llama-8B');
  return tokenizer;
}

const tokenizer = await loadTokenizer();
const prompt = tokenizer.apply_chat_template(
  [{ role: 'user', content: 'How are you today?' }],
  { tokenize: false, add_generation_prompt: true },
);
```
An error is thrown when `tokenizer.apply_chat_template` is called.
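Until the upstream fix is released, a possible stopgap is to bypass the model's built-in chat template and render a minimal one instead. This is only a sketch: it assumes `apply_chat_template` accepts a `chat_template` override (as the Python library does), and the `fallbackTemplate` string below is a placeholder that does not match DeepSeek-R1's actual prompt format, so the resulting prompt may degrade generation quality.

```js
// Workaround sketch: pass a simplified Jinja template so the parser never
// has to parse the model's problematic built-in template.
// ASSUMPTIONS: the `chat_template` option is honored by apply_chat_template,
// and `fallbackTemplate` is a placeholder, NOT DeepSeek-R1's real format.
import { AutoTokenizer } from '@huggingface/transformers';

const tokenizer = await AutoTokenizer.from_pretrained('deepseek-ai/DeepSeek-R1-Distill-Llama-8B');

// Minimal Jinja template: one "role: content" line per message, plus an
// optional trailing generation prompt.
const fallbackTemplate =
  "{% for message in messages %}{{ message.role }}: {{ message.content }}\n{% endfor %}" +
  "{% if add_generation_prompt %}assistant: {% endif %}";

const prompt = tokenizer.apply_chat_template(
  [{ role: 'user', content: 'How are you today?' }],
  { chat_template: fallbackTemplate, tokenize: false, add_generation_prompt: true },
);
console.log(prompt);
```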
Hi! This will be fixed by huggingface/huggingface.js#1142.
New release coming soon.