Description
Hello, could you please provide an inference script for Oryx, or at least tell us which version of transformers you are using?
I receive the following error when attempting to run inference with Oryx:
Value error, The checkpoint you are trying to load has model type oryx_qwen but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
You can update Transformers with the command pip install --upgrade transformers. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command pip install git+https://github.com/huggingface/transformers.git [type=value_error, input_value=ArgsKwargs((), {'model': ...gits_processors': None}), input_type=ArgsKwargs]
I am using Oryx from this Hugging Face repo: https://huggingface.co/THUdyh/Oryx-7B
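For context, my loading code is essentially the following minimal sketch. The trust_remote_code flag and the dtype/device settings are my own guesses rather than anything from an official Oryx script, so the actual intended loading path may differ:

```python
# Minimal loading sketch that reproduces the error above.
# Assumes the standard Hugging Face AutoModel API; trust_remote_code=True
# is a guess -- I am not sure whether the Oryx checkpoint ships custom code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "THUdyh/Oryx-7B"

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    trust_remote_code=True,
    torch_dtype="auto",   # let transformers pick the checkpoint's dtype
    device_map="auto",    # requires accelerate; places weights automatically
)
```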
Am I using the wrong version of Oryx?