
Issues with Transferring GIGA to Custom Dataset and Training Performance on Smaller Scales #36

Open
Fanqyu opened this issue Jan 11, 2025 · 1 comment



Fanqyu commented Jan 11, 2025

Hi,

Thanks for your excellent work!

I’ve been working on transferring GIGA to a new custom dataset, but I’ve run into a few issues. My inputs strictly follow the required format, but I noticed that you didn’t reuse VGN’s data augmentation, so I adapted that augmentation code to GIGA’s input format. However, even with the augmentation applied, training still converges more slowly than I expected.
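For reference, this is roughly the kind of augmentation I mean. It is only a minimal sketch, assuming a (channel, x, y, z) grid layout and hypothetical field names (it is not GIGA’s or VGN’s actual dataset code, and VGN additionally jitters the scene height, which I’ve omitted here):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def augment_sample(tsdf, grasp_pos, grasp_rot, query_points, size=0.3):
    """Rotate the whole sample by a random multiple of 90 degrees about the
    vertical (z) axis through the workspace centre.

    tsdf:         (1, N, N, N) voxel grid, axes ordered (channel, x, y, z)
    grasp_pos:    (3,) grasp centre in workspace coordinates [m]
    grasp_rot:    scipy Rotation holding the grasp orientation
    query_points: (M, 3) occupancy query points in workspace coordinates [m]
    """
    k = np.random.randint(4)                 # 0, 90, 180 or 270 degrees
    angle = k * np.pi / 2.0
    R = Rotation.from_rotvec(angle * np.array([0.0, 0.0, 1.0]))
    centre = np.full(3, 0.5 * size)          # rotation centre = workspace centre

    # np.rot90 with axes=(1, 2) rotates the x-y plane of the grid; with the
    # (channel, x, y, z) layout above this matches a +angle rotation about z.
    tsdf = np.rot90(tsdf, k, axes=(1, 2)).copy()

    # Apply the same rigid transform to the grasp pose and the occupancy
    # query points so the labels stay consistent with the rotated grid.
    grasp_pos = R.apply(grasp_pos - centre) + centre
    grasp_rot = R * grasp_rot
    query_points = R.apply(query_points - centre) + centre
    return tsdf, grasp_pos, grasp_rot, query_points
```

I restricted the rotation to 90° steps so the grid rotation is exact and needs no interpolation.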

Additionally, I’m curious about GIGA’s performance on smaller-scale datasets. Specifically, I’m working with a dataset in which the Pile and Packed categories each contain 5,000 scenes. In my experiments, VGN can handle this scale.

If you’ve already experimented with or have any insights regarding these issues, I would greatly appreciate your guidance.

Thanks a lot!

Steve-Tod (Collaborator) commented

Hi, we haven't tried training GIGA on a smaller dataset. My intuition is that a small dataset might not be enough to train both the occupancy and the implicit affordance networks jointly. I would suggest removing the occupancy branch and training giga-aff instead, to see whether convergence is faster.
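To make that concrete, here is a minimal, hypothetical sketch of the two objectives. The field names, weights, and loss forms below are illustrative only (not copied from the repo): GIGA trains the affordance and occupancy heads jointly, while giga-aff keeps only the affordance terms. If your copy matches the public repo layout, check scripts/train_giga.py for the option that selects the affordance-only network.

```python
import torch
import torch.nn.functional as F

def grasp_affordance_loss(qual_pred, rot_pred, width_pred, qual, rot, width):
    # Grasp-quality classification, rotation regression (quaternion dot
    # product), and gripper-width regression, masked by positive labels.
    loss_qual = F.binary_cross_entropy(qual_pred, qual)
    loss_rot = (qual * (1.0 - torch.abs((rot_pred * rot).sum(dim=-1)))).mean()
    loss_width = (qual * (width_pred - width).pow(2)).mean()
    return loss_qual + loss_rot + 0.01 * loss_width

def occupancy_loss(occ_pred, occ):
    # Binary occupancy reconstruction over sampled query points.
    return F.binary_cross_entropy_with_logits(occ_pred, occ)

def total_loss(pred, target, use_occupancy=True):
    # pred/target are dicts of tensors; keys here are hypothetical.
    loss = grasp_affordance_loss(
        pred["qual"], pred["rot"], pred["width"],
        target["qual"], target["rot"], target["width"],
    )
    if use_occupancy:                    # full GIGA: affordance + occupancy
        loss = loss + occupancy_loss(pred["occ"], target["occ"])
    return loss                          # giga-aff: affordance terms only
```

On a small dataset, dropping the occupancy term removes the geometry-reconstruction task that otherwise has to be learned from the same limited scenes.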
