
Error when running python web_demo.py after installing Gradio. Please advise. #904

Open
1 task done
hunkren opened this issue May 4, 2023 · 5 comments



hunkren commented May 4, 2023

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

After installing Gradio, running python web_demo.py fails.
Could someone please advise?
E:\chatGLM\ChatGLM-6B>python web_demo.py
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\tokenization_utils_base.py", line 1965, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "C:\Users\中文用户名/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\658202d88ac4bb782b99e99ac3adff58b4d0b813\tokenization_chatglm.py", line 209, in __init__
    self.sp_tokenizer = SPTokenizer(vocab_file, num_image_tokens=num_image_tokens)
  File "C:\Users\中文用户名/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\658202d88ac4bb782b99e99ac3adff58b4d0b813\tokenization_chatglm.py", line 61, in __init__
    self.text_tokenizer = TextTokenizer(vocab_file)
  File "C:\Users\中文用户名/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\658202d88ac4bb782b99e99ac3adff58b4d0b813\tokenization_chatglm.py", line 22, in __init__
    self.sp.Load(model_path)
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\sentencepiece\__init__.py", line 905, in Load
    return self.LoadFromFile(model_file)
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\sentencepiece\__init__.py", line 310, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
OSError: Not found: "C:\Users\中文用户名/.cache\huggingface\hub\models--THUDM--chatglm-6b\snapshots\658202d88ac4bb782b99e99ac3adff58b4d0b813\ice_text.model": No such file or directory Error #2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\chatGLM\ChatGLM-6B\web_demo.py", line 4, in <module>
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 702, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\tokenization_utils_base.py", line 1811, in from_pretrained
    return cls._from_pretrained(
  File "C:\Users\中文用户名\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\tokenization_utils_base.py", line 1967, in _from_pretrained
    raise OSError(
OSError: Unable to load vocabulary from file. Please check that the provided vocabulary is accessible and not corrupted.
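The first OSError says ice_text.model is missing from the cached snapshot. A quick diagnostic sketch to list which files are absent from the cache; the required-file list is an assumption based on the paths appearing in the traceback, not an exhaustive inventory of the repo:

```python
from pathlib import Path

# Assumed minimal file list, based on the paths in the traceback above.
REQUIRED = ("ice_text.model", "config.json", "tokenization_chatglm.py")

def missing_files(snapshot_dir):
    """Return which required files are absent from a snapshot directory."""
    snap = Path(snapshot_dir)
    return [name for name in REQUIRED if not (snap / name).is_file()]

# Check every cached snapshot of THUDM/chatglm-6b under the default cache.
hub = Path.home() / ".cache" / "huggingface" / "hub" / "models--THUDM--chatglm-6b" / "snapshots"
if hub.is_dir():
    for snap in hub.iterdir():
        print(snap.name, "missing:", missing_files(snap))
```

If ice_text.model shows up as missing, the download was interrupted and re-fetching the model (or deleting the snapshot and letting transformers re-download it) should fix the load.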

Expected Behavior

No response

Steps To Reproduce

Windows 11. After installing PyTorch (11.8), CUDA, and Gradio, run python web_demo.py.

Environment

- OS: Win11
- Python: 3.10
- Transformers:
- PyTorch: 11.8
- CUDA Support: True

Anything else?

No response


invado commented May 4, 2023

The THUDM/chatglm-6b model most likely did not finish downloading. Try downloading it to a local directory and pointing the code at the correct path.
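A minimal sketch of that suggestion, assuming the repo was cloned to a local folder. The path below is a placeholder, and resolve_model_dir is my own pre-flight check, not part of web_demo.py:

```python
from pathlib import Path

def resolve_model_dir(path):
    """Fail early with a clear message if the local checkout is incomplete."""
    p = Path(path)
    missing = [f for f in ("ice_text.model", "config.json") if not (p / f).is_file()]
    if missing:
        raise FileNotFoundError(f"{path} is missing: {missing}")
    return str(p)

if __name__ == "__main__":
    from transformers import AutoTokenizer, AutoModel

    # Placeholder path: point it at the directory you cloned from Hugging Face.
    model_dir = resolve_model_dir(r"E:\chatGLM\chatglm-6b")
    tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
```

Failing early on a missing ice_text.model gives a clearer error than the nested OSError in the report above.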


sliontc commented Jun 9, 2023

The model is already local, and it still fails to run.


xmlv commented Jun 11, 2023

Before changing any code, first test whether it runs with the default download path.
If that works, then either the files you downloaded locally are incomplete or you edited the code incorrectly.


mainxx commented Jun 25, 2023

I have more or less the same problem. The model is already downloaded.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True, device='cuda')

I changed these to the local path. I have tried everything and it still does not work. Is there some problem with running this on Windows?
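One Windows-specific factor worth ruling out (my assumption; it is not confirmed anywhere in this thread): the cache path in the traceback runs through C:\Users\中文用户名, and non-ASCII paths have historically caused trouble for native libraries such as sentencepiece on Windows. Relocating the Hugging Face cache to an ASCII-only path via the HF_HOME environment variable takes that variable out of the picture; the directory below is a placeholder:

```python
import os

# Assumed ASCII-only location; create the directory first. This must run
# before `import transformers`, or be set system-wide in the shell, so the
# cache is resolved under the new path.
os.environ["HF_HOME"] = r"D:\hf_cache"
```

After setting it, delete nothing; transformers will simply re-download the model into the new cache on the next run.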


xmlv commented Jun 28, 2023

Post the error output so we can take a look. I have deployed this on several Windows machines, so it should work fine.
