Hugging Face Model #4

Open · Kikzter opened this issue Feb 20, 2024 · 3 comments

Kikzter commented Feb 20, 2024

I am not able to find the model on Hugging Face. Can you help me with how to find it there? Also, Gradio is not working for me.

qijimrc (Collaborator) commented Feb 20, 2024

Hi, please find the updated links to our Hugging Face models in the README. For the Gradio demo, as I answered in this issue, you can follow requirements.txt to ensure the proper versions of pydantic and gradio are installed (you can use --no-deps to ignore conflicts).
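
For example, a minimal environment-fix sketch (assuming requirements.txt in the repo pins compatible versions of gradio and pydantic):

```bash
# Install the pinned versions from requirements.txt; --no-deps skips pip's
# dependency resolution so conflicts with other packages don't block the install.
pip install -r requirements.txt --no-deps
```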

Kikzter (Author) commented Feb 20, 2024

I checked the README, where I found this command: `python cli_demo_hf.py --from_pretrained THUDM/cogcom-base-17b-hf --bf16 --local_tokenizer path/to/tokenizer --bf16 --english`. I believe this is for inference. Is there a way to fine-tune the model in INT4, for example by loading it from the Hugging Face Hub? When I paste the model ID into Hugging Face, I cannot find the model card. It would be helpful if you could guide me on this.

qijimrc (Collaborator) commented Feb 20, 2024

Hi, the trained models can be downloaded manually from these links, and you can fine-tune or run inference on them by specifying the argument `--from_pretrained=/path/to/unzipped_model_folder`. Currently we don't support automatic model loading through the Transformers library, but we'll work on supporting it soon. You can easily use SAT for model training, and all configurations (model parallelism, DeepSpeed optimization, LoRA, INT2/INT4 quantization for inference, etc.) can be set via arguments (try the finetune.sh prepared in our repo).
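
A rough end-to-end sketch (the archive name, local paths, and the exact finetune.sh interface are assumptions; take the real values from the README and scripts in the repo):

```bash
# 1. Download a checkpoint manually via the links in the README, then unzip it
#    (the archive name below is a placeholder).
unzip cogcom-base-17b.zip -d ./checkpoints/cogcom-base-17b

# 2. Run inference by pointing --from_pretrained at the unzipped folder
#    instead of a Hub model ID.
python cli_demo_hf.py --from_pretrained ./checkpoints/cogcom-base-17b \
    --bf16 --local_tokenizer path/to/tokenizer --english

# 3. Fine-tune with the SAT-based script; model parallelism, DeepSpeed, LoRA,
#    and quantization options are set through its arguments.
bash finetune.sh
```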
