inference problem #144

Open
Maomaoxion opened this issue Oct 18, 2023 · 0 comments

Exception in thread Thread-7:
Traceback (most recent call last):
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/transformers/generation/utils.py", line 1648, in generate
    return self.sample(
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/transformers/generation/utils.py", line 2730, in sample
    outputs = self(
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/transformers/models/gpt_bigcode/modeling_gpt_bigcode.py", line 808, in forward
    transformer_outputs = self.transformer(
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data/starCoder/software/conda/envs/torch/lib/python3.8/site-packages/transformers/models/gpt_bigcode/modeling_gpt_bigcode.py", line 605, in forward
    self_attention_mask = self_attention_mask * attention_mask.view(batch_size, 1, -1).to(
RuntimeError: The size of tensor a (8192) must match the size of tensor b (8193) at non-singleton dimension 2

When I run long conversations, this error occurs partway through generation. How can I fix it?
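
The tensor sizes in the error (8192 vs. 8193) point at StarCoder's context window: gpt_bigcode checkpoints have n_positions = 8192, and once the accumulated conversation plus the tokens generated so far crosses that limit, the attention mask built inside generate() no longer lines up with the model's causal-mask buffer. One common workaround is to truncate the prompt so that its length plus max_new_tokens never exceeds the window. Below is a minimal sketch of that approach, assuming the model is loaded through transformers' AutoModelForCausalLM / AutoTokenizer; the checkpoint name and token budgets are placeholders, not values from this issue:

# Minimal workaround sketch: keep prompt + generation inside the
# 8192-token window. Names and budgets below are hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "bigcode/starcoder"  # placeholder: use your actual checkpoint
MAX_CONTEXT = 8192                # model.config.n_positions for gpt_bigcode
MAX_NEW_TOKENS = 512              # hypothetical generation budget

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, device_map="auto")

def generate_bounded(prompt: str) -> str:
    # Reserve room for the tokens about to be generated, then keep only
    # the most recent part of the prompt so that
    # prompt_length + MAX_NEW_TOKENS <= MAX_CONTEXT always holds.
    budget = MAX_CONTEXT - MAX_NEW_TOKENS
    ids = tokenizer(prompt, return_tensors="pt").input_ids[:, -budget:]
    ids = ids.to(model.device)
    out = model.generate(ids, max_new_tokens=MAX_NEW_TOKENS, do_sample=True)
    # Return only the newly generated continuation.
    return tokenizer.decode(out[0, ids.shape[1]:], skip_special_tokens=True)

For a chat-style loop, the same budget check would be applied to the concatenated conversation history before every call to generate(), so the history is trimmed from the front as it grows.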
