Traceback (most recent call last):
  File "/home/llm/pretrain/LLaMA-Factory/src/train_bash.py", line 14, in <module>
    main()
  File "/home/llm/pretrain/LLaMA-Factory/src/train_bash.py", line 5, in main
    run_exp()
  File "/home/llm/pretrain/LLaMA-Factory/src/llmtuner/tuner/tune.py", line 26, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
  File "/home/llm/pretrain/LLaMA-Factory/src/llmtuner/tuner/sft/workflow.py", line 28, in run_sft
    model, tokenizer = load_model_and_tokenizer(model_args, finetuning_args, training_args.do_train, stage="sft")
  File "/home/llm/pretrain/LLaMA-Factory/src/llmtuner/tuner/core/loader.py", line 71, in load_model_and_tokenizer
    tokenizer = AutoTokenizer.from_pretrained(
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 738, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2017, in from_pretrained
    return cls._from_pretrained(
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2249, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/bjkj3/.cache/huggingface/modules/transformers_modules/ChatGLM3-6B-yuangong_shouce-pretrain/tokenization_chatglm.py", line 93, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces, **kwargs)
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 363, in __init__
    super().__init__(**kwargs)
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 1604, in __init__
    super().__init__(**kwargs)
  File "/home/bjkj3/miniconda3/envs/py39-lora-torch_2.0.1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 861, in __init__
    setattr(self, key, value)
AttributeError: can't set attribute
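
For context on the last frame: the base tokenizer `__init__` in transformers assigns the remaining special-token kwargs with `setattr(self, key, value)`, and the usual cause of this error with ChatGLM tokenizers on newer transformers versions is that the checkpoint's bundled tokenization_chatglm.py exposes some of those names as read-only `@property` attributes, so the assignment fails. A minimal sketch of that failure mode, with illustrative names rather than the actual ChatGLM code:

```python
# Illustrative only: setattr() on a property that has no setter fails with the
# same message as the traceback above (Python 3.9 phrases it as
# "AttributeError: can't set attribute").
class ReadOnlyTokenizer:
    @property
    def pad_token(self):  # read-only: no @pad_token.setter defined
        return "<unk>"

tok = ReadOnlyTokenizer()
setattr(tok, "pad_token", "<pad>")  # AttributeError: can't set attribute
```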
Incremental pretraining script:
Export the pretrained weights as a standalone model:
Fine-tuning script:
Error message:
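
To isolate the problem from LLaMA-Factory, the tokenizer error can usually be reproduced with transformers alone; the path below is a placeholder for the exported checkpoint referenced above:

```python
from transformers import AutoTokenizer

# Placeholder path: point this at the exported ChatGLM3-6B-yuangong_shouce-pretrain checkpoint.
tokenizer = AutoTokenizer.from_pretrained(
    "/path/to/ChatGLM3-6B-yuangong_shouce-pretrain",
    trust_remote_code=True,  # loads the checkpoint's own tokenization_chatglm.py
)
print(tokenizer)
```

If this snippet raises the same AttributeError outside LLaMA-Factory, the issue lies in the interaction between the checkpoint's custom tokenizer code and the installed transformers version rather than in the training scripts.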