Reminder
Reproduction
# chatglm_Merge/chatglm_log_8_16 is the merged model
CUDA_VISIBLE_DEVICES=0 python src/evaluate.py \
    --model_name_or_path chatglm_Merge/chatglm_log_8_16 \
    --template vanilla \
    --task ceval \
    --split validation \
    --lang zh \
    --n_shot 5 \
    --batch_size 4
Expected behavior
No response
System Info
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer.model
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file added_tokens.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file special_tokens_map.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer_config.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer.json
Traceback (most recent call last):
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/evaluate.py", line 10, in <module>
main()
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/evaluate.py", line 5, in main
evaluator = Evaluator()
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/llmtuner/eval/evaluator.py", line 25, in __init__
self.model, self.tokenizer = load_model_and_tokenizer(self.model_args, finetuning_args)
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/llmtuner/model/loader.py", line 49, in load_model_and_tokenizer
tokenizer = AutoTokenizer.from_pretrained(
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 774, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
return cls._from_pretrained(
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/home/hp/.cache/huggingface/modules/transformers_modules/chatglm_log_8_16/tokenization_chatglm.py", line 93, in __init__
super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces, **kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 363, in __init__
super().__init__(**kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1602, in __init__
super().__init__(**kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 861, in __init__
setattr(self, key, value)
AttributeError: can't set attribute 'eos_token'
Others
No response
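For context on the traceback above: the failure happens when the base tokenizer's `__init__` tries `setattr(self, "eos_token", ...)`. A minimal sketch of that failure mode, assuming (as in newer transformers releases) that the special-token attribute is defined as a read-only property on the base class while the remote-code ChatGLM tokenizer still forwards it as a plain kwarg. The class names below are hypothetical stand-ins, not the real transformers classes:

```python
# Hypothetical reduction of the "can't set attribute 'eos_token'" error.
class BaseTokenizer:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            # Mirrors tokenization_utils_base.py line 861 in the traceback:
            # setattr fails when the key names a read-only property.
            setattr(self, key, value)

    @property
    def eos_token(self):
        # Property with no setter -> assigning to it raises AttributeError.
        return "</s>"


class CustomChatGLMTokenizer(BaseTokenizer):
    def __init__(self, **kwargs):
        # Forwards eos_token as an ordinary kwarg, like the remote
        # tokenization_chatglm.py does via super().__init__(**kwargs).
        super().__init__(**kwargs)


try:
    CustomChatGLMTokenizer(eos_token="</s>")
except AttributeError as e:
    print(f"AttributeError: {e}")
```

If this is indeed the cause, downgrading transformers to a version that predates the property-based special tokens, or updating the merged model's `tokenization_chatglm.py` to the latest remote code, are the usual workarounds.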