I am trying to launch a private swarm using a model I have downloaded to a local directory, which I think should work in theory. However, it presents an error when I try to run the command to start the first server.
This is what I'm getting.
python -m petals.cli.run_server lzlv --new_swarm
Dec 21 19:41:15.890 [INFO] Running Petals 2.3.0.dev1
Dec 21 19:41:16.231 [INFO] Make sure you follow the LLaMA's terms of use: https://bit.ly/llama2-license for LLaMA 2, https://bit.ly/llama-license for LLaMA 1
Traceback (most recent call last):
File "/opt/conda/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/opt/conda/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/opt/conda/lib/python3.10/site-packages/petals/cli/run_server.py", line 235, in
main()
File "/opt/conda/lib/python3.10/site-packages/petals/cli/run_server.py", line 219, in main
server = Server(
File "/opt/conda/lib/python3.10/site-packages/petals/server/server.py", line 119, in init
assert UID_DELIMITER not in dht_prefix and CHAIN_DELIMITER not in dht_prefix, (
TypeError: argument of type 'NoneType' is not iterable
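If I'm reading the traceback right, dht_prefix is None by the time the server reaches that assert, and a membership test on None raises TypeError before the assertion itself can fail. A minimal standalone sketch (not Petals code) that reproduces the same error for illustration:

    # Minimal sketch (not Petals code): `in` / `not in` on None raises TypeError,
    # which is why this surfaces as TypeError rather than AssertionError.
    UID_DELIMITER = "."
    dht_prefix = None  # apparently what dht_prefix ends up as when loading from a local folder
    assert UID_DELIMITER not in dht_prefix, "invalid prefix"
    # -> TypeError: argument of type 'NoneType' is not iterable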
The same command works fine when I launch it using the Hugging Face path for the same model. The local folder I reference contains the model downloaded straight from Hugging Face, including config.json.
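Since Server receives dht_prefix as a parameter (visible in the traceback), a possible workaround is to supply a prefix explicitly instead of letting Petals derive one from the local folder name. The --dht_prefix flag is my assumption from the parameter name; please check `python -m petals.cli.run_server --help` to confirm it exists in your version. The value "lzlv" below is just the same name used in the original command:

    # Hedged workaround sketch: pass an explicit DHT prefix for the local model
    python -m petals.cli.run_server lzlv --new_swarm --dht_prefix lzlv

With a prefix given explicitly, the assert on server.py line 119 should see a string instead of None.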