
"NoneType object is not callable" on stopping P2P #237

Closed
slush0 opened this issue Jan 25, 2023 · 10 comments · Fixed by learning-at-home/hivemind#579
Labels: 1day (A problem that can be solved in a single day's time), bug (Something isn't working), good first issue (Good for newcomers), help wanted (Collaborators needed)

Comments


slush0 commented Jan 25, 2023

I have a very simple inference testing script. No threading or any other advanced stuff. It's basically a "hello world" inference on Petals. Everything goes well, but when the script exits, I always get this error:

Exception ignored in: <function P2P.__del__ at 0x7f4ac1feed40>
Traceback (most recent call last):
  File "/home/dev/.local/lib/python3.10/site-packages/hivemind/p2p/p2p_daemon.py", line 632, in __del__
  File "/home/dev/.local/lib/python3.10/site-packages/hivemind/p2p/p2p_daemon.py", line 659, in _terminate
  File "/home/dev/.local/lib/python3.10/site-packages/multiaddr/multiaddr.py", line 254, in value_for_protocol
TypeError: 'NoneType' object is not callable

It is a rather cosmetic issue, but something is not OK there.
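
For context on this class of error: "'NoneType' object is not callable" inside a __del__ usually means the destructor ran during interpreter shutdown, after some module-level names had already been cleared to None. Below is a minimal standalone sketch of that mechanism, not hivemind code and not a confirmed diagnosis of this particular traceback (whether it prints the same "Exception ignored" message depends on the interpreter's teardown order):

# sketch.py - the generic shape of "'NoneType' object is not callable" in __del__
def lookup(value):
    # Stand-in for a module-level helper (like the ones multiaddr calls internally).
    return value.upper()

class Peer:
    def __del__(self):
        # If this destructor only runs at interpreter exit, the module-level name
        # `lookup` may already have been set to None by module teardown, so the
        # call raises TypeError: 'NoneType' object is not callable, reported as
        # "Exception ignored in: <function Peer.__del__ ...>".
        lookup("cleanup")

peer = Peer()  # never released explicitly; finalized only during shutdown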

@borzunov (Collaborator)

@slush0 Thanks for reporting! That's a bug in hivemind, I'll fix it :)


slush0 commented Jan 25, 2023

For clarity, this is the failing script.

$ cat test2.py 
#!/usr/bin/env python
import torch
from transformers import BloomTokenizerFast
from petals import DistributedBloomForCausalLM

dev = 'cpu'
prompt = """Lorem ipsum dolor sit amet"""
model = DistributedBloomForCausalLM.from_pretrained("bigscience/bloomz-petals").to(dev)
tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloomz-petals")
inputs = tokenizer(prompt, return_tensors="pt").input_ids.to(dev)
outputs = model.generate(inputs,
            max_new_tokens=40,
            temperature=0.00001,
            repetition_penalty=1.2)
print(tokenizer.decode(outputs[0]))

borzunov added the help wanted, bug, and good first issue labels on Mar 8, 2023

borzunov commented Mar 8, 2023

Update: I still haven't fixed this bug, since I didn't understand its cause after a short investigation and switched to higher-priority tasks. I'd appreciate it if someone (including new contributors) looked into it.

borzunov added the 1day label (A problem that can be solved in a single day's time) on Mar 9, 2023

borzunov commented May 9, 2023

@slush0 I can't reproduce this bug with the latest versions of hivemind and Petals, so I assume it may have been resolved in hivemind/Petals or their dependencies. I'll close this issue, but please let me know if this bug still happens in your environment (even after you update hivemind & Petals).

borzunov closed this as completed on May 9, 2023

borzunov commented May 9, 2023

Please also note that the repetition_penalty argument was ignored in our code due to a bug. In fact, this parameter is not implemented yet in the main branch of Petals. Please let me know if you need it: I have a separate (almost finished) branch with this parameter, and if you do, I'll finish and merge it.
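
For the reproduction script above, this simply means the argument can be dropped for now without changing the output; the rest of the generate() call stays as it was (a minor adjustment of the earlier snippet, nothing official):

outputs = model.generate(inputs,
            max_new_tokens=40,
            temperature=0.00001)  # repetition_penalty removed; it is ignored anyway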

@borzunov (Collaborator)

The issue is still relevant: #368 (comment)

borzunov reopened this on Jul 22, 2023
@borzunov (Collaborator)

Fixing it in learning-at-home/hivemind#579


ivangabriele commented Jul 22, 2023

Understood! Yes, from what I understand, there is no concept of multi-version installation in pip.

Thank you for the quick reply and the PR. I'll wait for the merge and the next release of hivemind then ^^.


borzunov commented Jul 22, 2023

@ivangabriele You don't have to wait for the next release, since the Petals main branch depends on the latest master branch of hivemind.

I've merged the hivemind fix, so now you can just upgrade petals and the bug should be gone:

pip install --upgrade git+https://github.com/bigscience-workshop/petals

Let us know if you run into any other issues!
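
If it helps, here is a quick way to confirm what actually got installed after the upgrade (a small aside, assuming both packages expose __version__, which recent releases of hivemind and petals do):

import hivemind
import petals

# Print the installed versions to confirm the upgrade picked up the fixed hivemind.
print("hivemind:", hivemind.__version__)
print("petals:", petals.__version__)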

mryab pushed a commit to learning-at-home/hivemind that referenced this issue Jul 23, 2023

ivangabriele commented Jul 23, 2023

@borzunov I didn't reply yesterday, but I can confirm here that your hivemind dependency change fixed my issue ^^.
