ValueError: high <= 0 #12
Your `len(indices)` is <= 0.
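A minimal sketch of how an empty `indices` array leads to this error. The mask and sampling call here are assumptions that mirror the snippet quoted above, not the repo's exact code: when no pixel in the label equals 1, `np.argwhere` returns an empty array, and sampling an index from an empty range raises `ValueError`.

```python
import numpy as np

# A label image with no pixel equal to 1 (e.g. after a lossy 'L' conversion):
mask = np.zeros((4, 4), dtype=np.uint8)

indices = np.argwhere(mask == 1)  # shape (0, 2): no foreground coordinates

try:
    # Hypothetical point-sampling step: pick a random foreground pixel.
    # With len(indices) == 0 the range is empty and randint raises ValueError.
    idx = np.random.randint(len(indices))
    point = indices[idx]
except ValueError as exc:
    print("sampling failed:", exc)
```

So the error is a symptom: the real problem is that the mask has no pixels with value 1 at the point where the code samples from it.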
Hello, I encountered the same error while running my dataset. Could this be caused by my dataset?
Maybe. I changed the dataset and the model ran successfully, but I don't really know why the previous dataset caused this.
When you have time, could you tell me what usually causes this error? At the moment I have no clues for fixing it, so your advice would be very valuable to me.
I found the cause: my dataset is a multi-target segmentation dataset. The source code converts the label images to 'L' mode and then computes indices = np.argwhere(mask == 1). But the label image of a multi-target segmentation dataset is not simply white after conversion to 'L', so the mask == 1 check can match nothing. I am now considering how to make this model work for multi-target segmentation.
@nanshanvv check my BTCV example case; you need to convert each target to a binary 2D map.
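A small sketch of the conversion described above: splitting an integer multi-class label map into one binary 2D map per class. The function name and the choice to drop class 0 as background are assumptions for illustration, not the BTCV example's actual code.

```python
import numpy as np

def split_to_binary_maps(label, ignore_background=True):
    """Split an integer multi-class label map into per-class binary 2D maps.

    Returns a dict {class_id: uint8 mask with values in {0, 1}}.
    """
    class_ids = np.unique(label)
    if ignore_background:
        class_ids = class_ids[class_ids != 0]  # assume 0 is background
    return {int(c): (label == c).astype(np.uint8) for c in class_ids}

# Example: a tiny label map with two targets labelled 1 and 2
label = np.array([[0, 1, 1],
                  [0, 2, 2],
                  [0, 0, 2]])
maps = split_to_binary_maps(label)
print(sorted(maps))                   # [1, 2]
print(maps[1].sum(), maps[2].sum())   # 2 3
```

Each resulting map then contains pixels equal to 1 for its target, so the `np.argwhere(mask == 1)` sampling step has something to pick from.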
Thank you for your reply, it was very useful for me |
Hello~ My own dataset is not very large, about 50MB.
Hello~ Sorry to bother you. How much GPU memory did you use for training, and which GPU model?
50A 48GB |
Mine is an A40 with 45GB. My dataset is not big; I changed the batch size and num_workers, but it runs out of GPU memory almost instantly. How did you solve it? Could you leave an email?~
With the 50A, setting batch_size to 4 works for me.
Hi, where is your BTCV example case?
The following error occurs when I use my own dataset for training. Can you please tell me what is causing it?