
Error while setting up Pytorch based DL model as serverless function for auto annotation #6895

Closed
NamalJayasuriya opened this issue Sep 24, 2023 · 1 comment · Fixed by #7047

@NamalJayasuriya

My actions before raising this issue

I'm trying to use auto annotation for a project with my already-trained DL model.
I deployed CVAT v2.7.1 from the GitHub source code.
My CVAT setup is working fine, and I followed the serverless tutorial (https://opencv.github.io/cvat/docs/manual/advanced/serverless-tutorial/#dl-model-as-a-serverless-function) from your documentation. I want to use it with a GPU, but I tried CPU-first to make sure everything works.

The functions under OpenVINO (omz-public-yolo-v3, omz-public-mask-rcnn) work fine, and I was able to try auto annotation with them. The tf-faster-rcnn example also worked well.
But nothing under pytorch worked for me. I tried the existing retinanet_r101 on GitHub @v2.7.1, and also the code given in the serverless tutorial mentioned above.

Error while trying pytorch-facebookresearch-detectron2-retinanet-r101

Steps before the error

All containers and docker images remaining from previous tries were killed, and I ran docker system prune as well.
Made sure CVAT is working fine with other models.
serverless/deploy_cpu.sh serverless/tensorflow/faster_rcnn_inception_v2_coco/ worked well, and I tried auto annotation with it.
Then I stopped faster rcnn using the nuclio dashboard at localhost:8070 and also checked the running containers (screenshot attached).
(screenshot: running containers)

Then I ran serverless/deploy_cpu.sh serverless/pytorch/facebookresearch/detectron2/retinanet_r101/ and got the following errors. Two screenshots are attached showing the start and end of the error output, as it is too long.
(screenshots: start and end of the error output)
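For reference, the sequence above can be sketched as a shell session. The deploy_cpu.sh script and both function paths are from the CVAT serverless tutorial; the WORKING_FUNC/FAILING_FUNC variable names are mine, and the actual deploy commands are left commented out since they need a running docker/nuclio setup in a CVAT checkout.

```shell
# Run from the root of a CVAT v2.7.1 source checkout.
WORKING_FUNC=serverless/tensorflow/faster_rcnn_inception_v2_coco/
FAILING_FUNC=serverless/pytorch/facebookresearch/detectron2/retinanet_r101/

# Step 1: deploy the TensorFlow function to confirm the serverless setup works.
# ./serverless/deploy_cpu.sh "$WORKING_FUNC"

# Step 2: stop it via the nuclio dashboard at http://localhost:8070, then
# confirm no stale function containers remain:
# docker ps --filter "name=nuclio"

# Step 3: deploy the PyTorch/detectron2 function; this is the step that fails.
# ./serverless/deploy_cpu.sh "$FAILING_FUNC"
```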

I also tried the code given in the tutorial mentioned above and ended up with a similar error.
It would be great if you could help resolve these errors.

My Environment

  • Ubuntu 22.04 6.2.0-33-generic
  • Docker version 24.0.6, build ed223bc
@alan30408

@NamalJayasuriya, I also got this issue. Is there any update here?
