Getting Warning: skip broken file #1139
Hi @balandongiv
Thanks for replying @Mountchicken. The images were stored as follows:
and
The config is as below:
Update: removing the
also results in a similar
Hi @Mountchicken, I noticed this issue also happens when I am using the
In summary, here are the steps, with the following config:
Training the model produces the following warning/error:
@balandongiv
Thanks for replying @Mountchicken, appreciate it (it's really like waiting for Santa). Anyhow, I did as recommended, but the issue still persists and can be reproduced via this notebook. The file/folder structure is as below:
The
Have you tested it on a GPU machine? I am able to train the model with your config without any trouble on my machine, but on Colab the process does fail at the transform. Maybe there is some minor difference we were not able to identify.
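Since the failure reproduces only on Colab, one way to narrow down that "minor difference" is to dump the relevant package versions in both environments and diff the output. A minimal stdlib sketch (the package list below is an assumption; adjust it to whatever your config depends on):

```python
import platform
from importlib.metadata import PackageNotFoundError, version


def environment_report(packages):
    """Collect Python/platform info plus installed versions of the given packages."""
    report = {
        "python": platform.python_version(),
        "platform": platform.platform(),
    }
    for name in packages:
        try:
            report[name] = version(name)
        except PackageNotFoundError:
            # The distribution is not installed in this environment.
            report[name] = "not installed"
    return report


# Run this on both Colab and the local machine, then diff the two outputs.
for key, value in environment_report(["torch", "mmcv-full", "mmdet", "mmocr"]).items():
    print(f"{key}: {value}")
```

Any package that differs in version (or is missing on one side) is a candidate for the Colab-only failure.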
Thanks for responding @gaotongxiao. Right now, I am having an issue installing the
May I know what you mean by
Is it from my side, or a setting within the
I'd suggest you install:

```shell
pip install mim
mim install mmcv-full
```

And
I was not able to identify the reason either, and the only clue that I have is that your config actually works well locally.
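As a quick sanity check after the `mim install mmcv-full` step, you can confirm the packages are importable before re-running training. This is just a stdlib sketch, not MMOCR's own tooling:

```python
import importlib.util


def check_modules(names):
    """Return a dict mapping each module name to whether it can be imported."""
    return {name: importlib.util.find_spec(name) is not None for name in names}


# Note: the mmcv-full distribution installs the `mmcv` import package;
# mmdet and mmocr are the usual companions in an MMOCR setup.
for name, ok in check_modules(["mmcv", "mmdet", "mmocr"]).items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```

If any of these report `MISSING` on Colab but not locally, the install step (rather than the config) is the first thing to fix.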
Thanks for replying @gaotongxiao. Actually, I have installed it using the recommended approach, but I got an
Similarly, when running the following on my local machine:
it produces
I think the issue arises at
Hi @gaotongxiao & @Mountchicken, I can now confirm the issue occurs only in Google Colab and not on a local machine. Thanks for your time.
While training the `dbnetpp` model, the console returns `Warning: skip broken file`. Unfortunately, this happens for almost all the `images`. Given the limited dataset, I am trying my best to utilise as many images as possible. May I know how to resolve this issue?
The full trace-back is as below:
In addition, storing the images directly under the `colab` directory produces a similar issue. Is the issue related to here?
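Since the warning suggests the data loader considers the images unreadable, a quick sanity check can help separate genuinely corrupt files from a path/config problem. The sketch below is not MMOCR's own check (which loads images through mmcv); it only inspects the files' magic bytes, so a file that passes here can still be truncated:

```python
import pathlib

# Header bytes for common image formats (an incomplete list; extend as needed).
MAGIC = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"BM": "bmp",
}


def looks_like_image(path):
    """Return the detected format name, or None if the header matches nothing known."""
    with open(path, "rb") as f:
        head = f.read(16)
    for magic, fmt in MAGIC.items():
        if head.startswith(magic):
            return fmt
    return None


def find_suspect_files(root):
    """List files under `root` whose headers do not look like a known image format."""
    return [
        p
        for p in pathlib.Path(root).rglob("*")
        if p.is_file() and looks_like_image(p) is None
    ]


if __name__ == "__main__":
    # Point this at the dataset image directory from the config (path is an example).
    for suspect in find_suspect_files("data/imgs"):
        print(f"suspect: {suspect}")
```

If this flags most of the files, the images themselves were corrupted (e.g. by a bad download or unzip on Colab); if it flags none, the problem is more likely in the annotation paths or the environment, as discussed above.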