
Guideline for generating correct lit_model.pth #9

Closed
mkeya013 opened this issue Sep 4, 2024 · 9 comments

@mkeya013

mkeya013 commented Sep 4, 2024

Dear author, is it possible to provide the correct lit_model.pth file? Alternatively, a guideline for generating the correct lit_model.pth file would be a great help.

Thank you.

@LinghaoChan
Collaborator

LinghaoChan commented Sep 4, 2024

@mkeya013 Thanks for your question.

1. `git clone` litgpt and go into the corresponding directory:

```shell
git clone https://github.com/Lightning-AI/litgpt.git
cd litgpt
git checkout d78730a0694bff6c9f1ce285a6e8b471c1321cf5
cd ./litgpt/scripts
```

To see all the available checkpoints for Vicuna, run:

```shell
python scripts/download.py | grep vicuna
```

which will print:

```
lmsys/vicuna-7b-v1.3
lmsys/vicuna-13b-v1.3
lmsys/vicuna-33b-v1.3
lmsys/vicuna-7b-v1.5
lmsys/vicuna-7b-v1.5-16k
lmsys/vicuna-13b-v1.5
lmsys/vicuna-13b-v1.5-16k
```
2. Download the model.

In order to use a specific Vicuna checkpoint, for instance vicuna-7b-v1.5, download the weights and convert the checkpoint to the lit-gpt format:

```shell
pip install huggingface_hub
python scripts/download.py --repo_id lmsys/vicuna-7b-v1.5
python scripts/convert_hf_checkpoint.py --checkpoint_dir checkpoints/lmsys/vicuna-7b-v1.5
```

By default, the convert_hf_checkpoint step will use the data type of the HF checkpoint's parameters. In cases where RAM
or disk size is constrained, it might be useful to pass --dtype bfloat16 to convert all parameters into this smaller precision before continuing.
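To get a feel for why `--dtype bfloat16` helps when resources are tight: a checkpoint stored in float32 takes twice the space of the same checkpoint in bfloat16. A rough back-of-the-envelope sketch (the parameter count is approximate):

```python
# Rough storage estimate for a ~7B-parameter checkpoint at two precisions.
NUM_PARAMS = 7_000_000_000  # approximate parameter count for a 7B model

def gib(num_bytes: int) -> float:
    """Convert bytes to GiB."""
    return num_bytes / 2**30

fp32_bytes = NUM_PARAMS * 4  # float32: 4 bytes per parameter
bf16_bytes = NUM_PARAMS * 2  # bfloat16: 2 bytes per parameter

print(f"float32:  {gib(fp32_bytes):.1f} GiB")   # prints "float32:  26.1 GiB"
print(f"bfloat16: {gib(bf16_bytes):.1f} GiB")   # prints "bfloat16: 13.0 GiB"
```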

You're done! To execute the model, just run:

```shell
pip install sentencepiece
python chat/base.py --checkpoint_dir checkpoints/lmsys/vicuna-7b-v1.5
```

This is the Vicuna version I used. You can also follow the latest official guideline and adjust it for vicuna-7b-v1.5.

@mkeya013
Author

mkeya013 commented Sep 8, 2024

It worked. Thank you.

@mkeya013 mkeya013 closed this as completed Sep 8, 2024
LinghaoChan added a commit that referenced this issue Sep 8, 2024
@CMY-CTO

CMY-CTO commented Oct 21, 2024

Dear Author,

For step 1 ("git clone `lit-gpt` and go into the corresponding directory"):

It's not working: [screenshot of the error]

Could you provide the correct repository?

Thank you.

@LinghaoChan
Collaborator

@CMY-CTO Sorry for this. Please revise it as:

```shell
git clone https://github.com/Lightning-AI/litgpt.git
cd litgpt
git checkout d78730a0694bff6c9f1ce285a6e8b471c1321cf5
```

@LinghaoChan
Collaborator

> @LinghaoChan Hi, when I follow this procedure I get the error `from litgpt.scripts.convert_hf_checkpoint import convert_hf_checkpoint` / `ModuleNotFoundError: No module named 'litgpt'`. How should I fix this? Neither the Hugging Face nor the ModelScope model hubs seem to have the lit_model.pth model.

@ransheng11 You need to download the vicuna-1.5-7B model as described in the README.

@LinghaoChan
Collaborator

> > @LinghaoChan Hi, when I follow this procedure I get the error `from litgpt.scripts.convert_hf_checkpoint import convert_hf_checkpoint` / `ModuleNotFoundError: No module named 'litgpt'`. How should I fix this? Neither the Hugging Face nor the ModelScope model hubs seem to have the lit_model.pth model.
> >
> > @ransheng11 You need to download the vicuna-1.5-7B model as described in the README.
>
> [screenshot] Hi, thanks for the quick reply. My question may be a bit naive, but I still can't get it running. I ran 1. `python scripts/download.py --repo_id lmsys/vicuna-7b-v1.5` and 2. `python scripts/convert_hf_checkpoint.py --checkpoint_dir checkpoints/lmsys/vicuna-7b-v1.5`. The first command fails with `requests.exceptions.ConnectionError: (ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer')), '(Request ID: c301ebb9-2ae9-44b6-8071-74d95a5a9ad4)')`, so I downloaded the model another way, as shown in the screenshot. The second command fails with `FileNotFoundError: [Errno 2] No such file or directory: 'checkpoints/lmsys/vicuna-7b-v1.5/model_config.yaml'`, but I don't see that file on either Hugging Face or ModelScope.

Please refer to the guidance provided by lit-gpt.

@ransheng11

@LinghaoChan Thanks. I think I have followed the litgpt guidelines, but it's okay; I downloaded the model from HF instead. Now I get an out-of-memory error when running the CLI. I have 24 GB of memory. Does MotionLLM support multiple cards? And does it support fine-tuning on custom datasets?

@JackYu6

JackYu6 commented Jan 21, 2025

After running

```shell
python scripts/download.py --repo_id lmsys/vicuna-7b-v1.5
```

I encountered the following error:

```
Traceback (most recent call last):
  File "/home/user/miniconda3/lib/python3.11/site-packages/litgpt/config.py", line 121, in from_name
    conf_dict = next(
                ^^^^^
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/litgpt/litgpt/scripts/download.py", line 137, in <module>
    CLI(download_from_hub)
  File "/home/user/miniconda3/lib/python3.11/site-packages/litgpt/utils.py", line 466, in CLI
    return CLI(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/user/miniconda3/lib/python3.11/site-packages/jsonargparse/_cli.py", line 96, in CLI
    return _run_component(components, init)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/miniconda3/lib/python3.11/site-packages/jsonargparse/_cli.py", line 204, in _run_component
    return component(**cfg)
           ^^^^^^^^^^^^^^^^
  File "/home/user/litgpt/litgpt/scripts/download.py", line 107, in download_from_hub
    convert_hf_checkpoint(checkpoint_dir=directory, dtype=dtype, model_name=model_name)
  File "/home/user/miniconda3/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/miniconda3/lib/python3.11/site-packages/litgpt/scripts/convert_hf_checkpoint.py", line 524, in convert_hf_checkpoint
    config = Config.from_name(model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/miniconda3/lib/python3.11/site-packages/litgpt/config.py", line 128, in from_name
    raise ValueError(f"{name!r} is not a supported config name")
ValueError: 'vicuna-7b-v1.5' is not a supported config name
```

How can I fix it? Thanks.
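For context, the final ValueError in that traceback is raised when `Config.from_name` finds no registered config matching the requested name, which usually means the litgpt version actually being imported does not know the vicuna-7b-v1.5 config; note the traceback mixes the cloned repo (`/home/user/litgpt/...`) with a pip-installed copy in `site-packages`, and the pip-installed copy's config list is what gets used. A simplified, hypothetical sketch of that lookup logic (not litgpt's actual code):

```python
# Hypothetical sketch of a name -> config lookup in the style of
# litgpt's Config.from_name: scan known configs, raise on no match.
KNOWN_CONFIGS = [  # illustrative subset; the real list is version-dependent
    {"name": "vicuna-7b-v1.3"},
    {"name": "vicuna-7b-v1.5"},
]

def from_name(name: str) -> dict:
    try:
        # next() raises StopIteration when no config matches, mirroring
        # the inner frame of the traceback above.
        return next(c for c in KNOWN_CONFIGS if c["name"] == name)
    except StopIteration:
        raise ValueError(f"{name!r} is not a supported config name") from None
```

Given that, one plausible fix is to make sure the pinned commit from step 1 is the version being run (for instance by uninstalling the pip copy and running the scripts from the checkout), since that revision is the one the author confirmed working.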

@conc1se333

> [quotes @JackYu6's command and full traceback above, ending in `ValueError: 'vicuna-7b-v1.5' is not a supported config name`]

I ran into the same issue. Did you solve it?
