
Conversation

@kevin36524

Fix Issue #112

Some HF models, like Mistral, need HUGGING_FACE_HUB_TOKEN to be set so that the tokenizer and model config can be downloaded.

Currently, we get this generic error when HUGGING_FACE_HUB_TOKEN is not set:

(venv) kevin@Mac swift-transformers % swift run transformers "Best recommendations for a place to visit in Paris in August 2024:" --max-length 128 Examples/Mistral7B/StatefulMistral7BInstructInt4.mlpackage

Building for debugging...
[1/1] Write swift-version-39B54973F684ADAB.txt
Build of product 'transformers' complete! (0.17s)
Compiling model Examples/Mistral7B/StatefulMistral7BInstructInt4.mlpackage -- file:///Users/arforgeqa/Projects/swift-transformers/
Loading model file:///var/folders/4x/2n2whrg17s7g6s9xcr67tmt80000gn/T/StatefulMistral7BInstructInt4_A3224709-606B-4B0E-B497-34BD3B7C7FA1.mlmodelc
Generating
Error: The data couldn’t be read because it isn’t in the correct format.

This PR adds more descriptive error messages that give the user a hint about the likely cause.
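The approach can be sketched as follows. This is a minimal illustration, not the actual swift-transformers code; the `HubConfigError` type and `parseConfig` function are hypothetical names used only for this sketch:

```swift
import Foundation

// Hypothetical error type: wrap a JSON deserialization failure in an
// error whose message hints at the missing Hugging Face token.
enum HubConfigError: Error, CustomStringConvertible {
    case jsonSerializationFailed(url: String)

    var description: String {
        switch self {
        case .jsonSerializationFailed(let url):
            return "JSON Serialization failed for \(url). "
                + "Please verify that you have set the HUGGING_FACE_HUB_TOKEN environment variable."
        }
    }
}

// Parse a downloaded config file, surfacing the descriptive error on failure.
// (A gated model returns a non-JSON error body, so parsing fails here.)
func parseConfig(_ data: Data, from url: String) throws -> [String: Any] {
    guard let object = try? JSONSerialization.jsonObject(with: data),
          let dict = object as? [String: Any] else {
        throw HubConfigError.jsonSerializationFailed(url: url)
    }
    return dict
}
```

The key design point is that the error message carries the file URL and the probable fix, instead of Foundation's generic "The data couldn't be read because it isn't in the correct format."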

New Output

kevin@Mac swift-transformers % swift run transformers "Write me an email wishing merry christmas" --max-length 20 ../kmodels/StatefulMistral7BInstructInt4.mlpackage
Building for debugging...
[11/11] Applying transformers
Build of product 'transformers' complete! (1.63s)
Compiling model ../kmodels/StatefulMistral7BInstructInt4.mlpackage -- file:///Users/kevin/projects/swift-transformers/
Loading model file:///var/folders/wl/tr847nmj2sz6b_w6qyyxmbr80000gn/T/StatefulMistral7BInstructInt4_8F1E98A9-413D-443D-B9FD-6F7A6B2AAA19.mlmodelc
Generating
Error: JSON Serialization failed for file:///Users/kevin/Documents/huggingface/models/mistralai/Mistral-7B-Instruct-v0.3/config.json. Please verify that you have set the HUGGING_FACE_HUB_TOKEN environment variable.
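Assuming the token is what's missing, a minimal shell sketch of the workaround (the token value below is a placeholder, not a real Hugging Face access token):

```shell
# Set a Hugging Face access token before running the CLI
# (hf_xxxxxxxxxxxx is a placeholder; use your own token)
export HUGGING_FACE_HUB_TOKEN=hf_xxxxxxxxxxxx

# Confirm the variable is exported before re-running the command
env | grep -q '^HUGGING_FACE_HUB_TOKEN=' && echo "token is set"
```

With the variable exported, re-running `swift run transformers ...` should let the tokenizer and model config download succeed for gated models.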

Member

@pcuenca pcuenca left a comment


Thank you! Conceptually looks good to me, but waiting a bit to merge:

  • I think HUGGING_FACE_HUB_TOKEN will not be supported until #111 is merged (it's also in the preview branch you used).
  • I'm slightly concerned about potential clients of HubApi receiving a different error now, but I suppose it should be ok.
  • I'd like to run the tests offline. We are planning to restore CI next week, hopefully, but meanwhile I'll do it locally.

@pcuenca
Member

pcuenca commented Aug 19, 2024

Tests pass locally. Thanks again @kevin36524!

@pcuenca pcuenca merged commit f4ab454 into huggingface:preview Aug 19, 2024
@kevin36524
Author

That's great, thanks @pcuenca!

pcuenca added a commit that referenced this pull request Sep 24, 2025
* Added error for JSON serialization errors

* Fix merge commit

---------

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
pcuenca added a commit that referenced this pull request Sep 25, 2025
* feat: preview

* Remove Random (#115)

* Throwing error when the configs fail JSON serialization (#114)

* Added error for JSON serialization errors

* Fix merge commit

---------

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Allow archiving for Mac (#121)

SPM dependencies are always compiled for the standard architectures, but
Float16 is not available for `x86_64`.

Thanks @joshnewnham for the workaround 🙌

* chore: strategic deletes avoid OOM

* Remove RepetitionPenaltyWarper, fix build

* Remove GenerationTests

* Restore TokenizerError

* Fix deprecation warnings in tests

* Move transformers-cli to an example

* Format

* Relax requirements for main package

But keep iOS 18 / macOS 15 for Core ML

* Revert platform requirements

* Relative package location plus comment

* Mistral example: uv-ify and unpin

* Remove obsolete GenerationTests again

---------

Co-authored-by: FL33TW00D <FL33TW00D@users.noreply.github.com>