
chatglm3: eos_token not found when running ceval evaluation #2353

Closed
1 task done
zhangtianhong-1998 opened this issue Jan 26, 2024 · 1 comment
Labels
duplicate This issue or pull request already exists

Comments

@zhangtianhong-1998

Reminder

  • I have read the README and searched the existing issues.

Reproduction

# chatglm_Merge/chatglm_log_8_16 is the merged model
CUDA_VISIBLE_DEVICES=0 python src/evaluate.py \
    --model_name_or_path chatglm_Merge/chatglm_log_8_16 \
    --template vanilla \
    --task ceval \
    --split validation \
    --lang zh \
    --n_shot 5 \
    --batch_size 4
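
Because the failure reported below happens while the tokenizer is being loaded, the merged checkpoint can be checked in isolation before launching the full evaluation. A minimal sketch, using the path from the command above and assuming trust_remote_code is needed for ChatGLM3's custom tokenizer code:

# Quick check that the merged checkpoint's tokenizer loads at all; this
# exercises the same AutoTokenizer.from_pretrained call that evaluate.py
# fails in, without starting the evaluation itself.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(
    "chatglm_Merge/chatglm_log_8_16",  # merged model path from the command above
    trust_remote_code=True,            # ChatGLM3 ships custom tokenization code
)
print(tok.eos_token)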

Expected behavior

No response

System Info

[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer.model
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file added_tokens.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file special_tokens_map.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer_config.json
[INFO|tokenization_utils_base.py:2024] 2024-01-26 21:55:28,354 >> loading file tokenizer.json
Traceback (most recent call last):
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/evaluate.py", line 10, in
main()
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/evaluate.py", line 5, in main
evaluator = Evaluator()
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/llmtuner/eval/evaluator.py", line 25, in init
self.model, self.tokenizer = load_model_and_tokenizer(self.model_args, finetuning_args)
File "/media/hp/3E9AFE709AFE2455/zth/LLM/LLaMA-Factory-main/src/llmtuner/model/loader.py", line 49, in load_model_and_tokenizer
tokenizer = AutoTokenizer.from_pretrained(
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 774, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
return cls._from_pretrained(
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/home/hp/.cache/huggingface/modules/transformers_modules/chatglm_log_8_16/tokenization_chatglm.py", line 93, in init
super().init(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces, **kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 363, in init
super().init(**kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1602, in init
super().init(**kwargs)
File "/home/hp/.pyenv/versions/anaconda3-5.3.0/envs/Factory/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 861, in init
setattr(self, key, value)
AttributeError: can't set attribute 'eos_token'
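
The setattr call at tokenization_utils_base.py line 861 in the trace is where transformers assigns the special-token attributes during tokenizer initialization. A minimal sketch of the failure mode, assuming (not confirmed from this log) that the bundled tokenization_chatglm.py defines eos_token as a read-only property, which is what makes that assignment raise:

# Not the actual transformers or ChatGLM3 code; a stripped-down illustration
# of how assigning a special token can fail when the custom tokenizer defines
# it as a property without a setter.
class SpecialTokensBase:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)  # analogous to tokenization_utils_base.py:861

class ChatGLMLikeTokenizer(SpecialTokensBase):
    @property
    def eos_token(self):  # read-only: no @eos_token.setter is defined
        return "</s>"

ChatGLMLikeTokenizer(eos_token="</s>")
# -> AttributeError: can't set attribute 'eos_token' (Python 3.10 wording)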

Others

No response

@hiyouga
Owner

hiyouga commented Jan 29, 2024

#1307 (comment)

@hiyouga hiyouga added the duplicate This issue or pull request already exists label Jan 29, 2024
@hiyouga hiyouga closed this as completed Jan 29, 2024