Empty gaps during semantic segmentation labelling #4868
Hi @venki-lfc, we are working on integrating Paint & Brush tools into CVAT, to support not only polygonal masks but also pixel-based masks.
Thank you for your comment!
My actions before raising this issue
Good Day,
I am using a DeepLabv3 PyTorch model for a semantic segmentation task, and I would like to integrate this model into the CVAT backend so that I can perform semi-automated labelling on new images. Let's say the trained DeepLabv3 model outputs an RGB mask (same height and width as the input). To use this prediction in CVAT, I need to convert the RGB pixel mask into polygon masks. To find the polygons, we use OpenCV's findContours function. But since these polygons are approximated, we get gaps between them, as shown below. Sometimes these gaps are so small that a human labeler might not notice them during validation. This is obviously a problem when training a new model on such labels: semantic segmentation models expect every pixel to have a class assigned to it, which is not the case here.
My question is: does CVAT have any function I could use to prevent these gaps from appearing when converting from a pixel-based mask to a polygon-based mask? If not, do you have any tips on how to avoid this issue?
Thanks a lot!
Steps to Reproduce (for bugs)
Context
Trying to generate semi-automated labels for semantic segmentation without any "empty spaces", because semantic segmentation models require every pixel to have a class assigned to it.
Your Environment