
Ascend NPU inference with glm4-9b-chat raises 'NoneType' object has no attribute 'do_sample' #4858

Closed
1 task done
AlexYoung757 opened this issue Jul 17, 2024 · 2 comments
Labels
npu (This problem is related to NPU devices), solved (This problem has been already solved)

Comments


AlexYoung757 commented Jul 17, 2024

Reminder

  • I have read the README and searched the existing issues.

System Info

  • Platform: Linux-4.19.36-vhulk1907.1.0.h619.eulerosv2r8.aarch64-aarch64-with-glibc2.28
  • Python version: 3.10.14
  • PyTorch version: 2.2.0 (NPU)
  • Transformers version: 4.41.2
  • Datasets version: 2.20.0
  • Accelerate version: 0.32.1
  • PEFT version: 0.11.1
  • TRL version: 0.9.6
  • NPU type: Ascend910PremiumA
  • CANN version: 8.0.RC1.alpha003

Reproduction

ASCEND_RT_VISIBLE_DEVICES=4 llamafactory-cli chat glm4_9b_chat.yaml

Expected behavior

Traceback (most recent call last):
  File "/root/miniconda3/envs/llm/bin/llamafactory-cli", line 8, in <module>
    sys.exit(main())
  File "/root/project/LLaMA-Factory/src/llamafactory/cli.py", line 81, in main
    run_chat()
  File "/root/project/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 125, in run_chat
    chat_model = ChatModel()
  File "/root/project/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 44, in __init__
    self.engine: "BaseEngine" = HuggingfaceEngine(model_args, data_args, finetuning_args, generating_args)
  File "/root/project/LLaMA-Factory/src/llamafactory/chat/hf_engine.py", line 58, in __init__
    self.model = load_model(
  File "/root/project/LLaMA-Factory/src/llamafactory/model/loader.py", line 159, in load_model
    patch_model(model, tokenizer, model_args, is_trainable, add_valuehead)
  File "/root/project/LLaMA-Factory/src/llamafactory/model/patcher.py", line 120, in patch_model
    if not gen_config.do_sample and (
AttributeError: 'NoneType' object has no attribute 'do_sample'
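The traceback shows `patch_model` reading `gen_config.do_sample` while the model's `generation_config` is `None`, which can happen when the model's config files have been hand-edited so that transformers no longer attaches a `GenerationConfig`. The sketch below is a hypothetical illustration of that failure mode with a defensive guard; `patch_generation_config` and the error message are my own names, not LLaMA-Factory's actual code.

```python
# Minimal sketch (assumption: simplified stand-in for LLaMA-Factory's
# patch_model logic, not the real implementation).
from types import SimpleNamespace


def patch_generation_config(model):
    gen_config = getattr(model, "generation_config", None)
    # If config.json / generation_config.json was edited incorrectly,
    # transformers may leave generation_config unset (None), and any
    # attribute access like gen_config.do_sample raises AttributeError.
    if gen_config is None:
        raise ValueError(
            "model.generation_config is None; restore the original model files"
        )
    if not gen_config.do_sample:
        # Enable sampling so chat inference is not forced to greedy decoding.
        gen_config.do_sample = True
    return gen_config


# A model whose generation config was wiped out by a bad edit:
broken_model = SimpleNamespace(generation_config=None)
try:
    patch_generation_config(broken_model)
except ValueError as err:
    print("caught:", err)
```

With the guard, the misconfiguration surfaces as a clear error instead of the opaque `AttributeError` above; the fix the maintainer suggests is simply to restore the unmodified model files.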

Others

No response

@github-actions github-actions bot added pending This problem is yet to be addressed npu This problem is related to NPU devices labels Jul 17, 2024
hiyouga (Owner) commented Jul 17, 2024

Do not modify the model files.

@hiyouga hiyouga added solved This problem has been already solved and removed pending This problem is yet to be addressed labels Jul 17, 2024
@hiyouga hiyouga closed this as completed Jul 17, 2024
AlexYoung757 (Author) commented Jul 17, 2024

The model files were modified following #4388.

2 participants