When I run the code above, the following error occurs:
Traceback (most recent call last):
  File "/root/MixLoRA/test.py", line 30, in <module>
    generate_ids = model.generate(inputs.input_ids, max_length=30)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/transformers/generation/utils.py", line 2024, in generate
    result = self._sample(
             ^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/transformers/generation/utils.py", line 2982, in _sample
    outputs = self(**model_inputs, return_dict=True)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 1189, in forward
    outputs = self.model(
              ^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 1001, in forward
    layer_outputs = decoder_layer(
                    ^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 750, in forward
    hidden_states = self.mlp(hidden_states)
                    ^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/MixLoRA/mixlora/model.py", line 210, in forward
    self.gate = self.gate.to(hidden_states)
    ^^^^^^^^^^
  File "/root/miniconda3/envs/mixlora/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1749, in __setattr__
    raise TypeError(f"cannot assign '{torch.typename(value)}' as parameter '{name}' "
TypeError: cannot assign 'torch.FloatTensor' as parameter 'gate' (torch.nn.Parameter or None expected)
What's wrong with my code, or what should I do to generate with MixLoRA?
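For context on the error itself: the TypeError at the bottom of the trace comes from PyTorch, not from transformers. gate is registered as an nn.Parameter, and Parameter.to(...) returns a plain Tensor whenever a real dtype or device conversion takes place; nn.Module.__setattr__ then refuses to store a plain Tensor in an attribute slot that was registered as a parameter. A minimal standalone sketch of the same failure (the Router class here is hypothetical, not MixLoRA code):

import torch
import torch.nn as nn

class Router(nn.Module):
    def __init__(self):
        super().__init__()
        # Registered as a Parameter, so assignments to self.gate go
        # through nn.Module.__setattr__'s parameter type check.
        self.gate = nn.Parameter(torch.zeros(4, dtype=torch.float16))

router = Router()
hidden_states = torch.randn(4)  # float32, so .to() must actually convert

# .to() returns the Parameter itself when nothing changes, but a plain
# Tensor once the dtype (or device) differs -- and that plain Tensor
# cannot be assigned back into the parameter slot:
router.gate = router.gate.to(hidden_states)
# TypeError: cannot assign 'torch.FloatTensor' as parameter 'gate'
#            (torch.nn.Parameter or None expected)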
Which version did you use for this code? I can't find the corresponding code for self.gate = self.gate.to(hidden_states).
self.gate = self.gate.to(hidden_states)
mixlora 0.2.2
This version was installed with "pip3 install mixlora".
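Until a fixed release is out, one possible workaround (my own suggestion, not an official fix from the maintainers) is to patch that line so it mutates the parameter's storage in place instead of rebinding the attribute; assigning to .data bypasses nn.Module.__setattr__ and keeps gate an nn.Parameter:

# Hypothetical patch for the offending line in mixlora/model.py
# (around line 210 in version 0.2.2):
self.gate.data = self.gate.data.to(hidden_states)

Rewrapping the converted tensor in nn.Parameter would also pass the type check, but mutating .data keeps the same Parameter object and works under torch.no_grad() during generation. Alternatively, moving the whole model to the dtype and device of your inputs before calling generate() (e.g. model.to(torch.float32), since the trace shows the converted gate is a float32 tensor) should avoid triggering the conversion at all.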