
Do you have a plan to pre-train Uni3D on Objaverse-XL, a 10M-scale 3D dataset? #20

Open
auniquesun opened this issue Jun 21, 2024 · 1 comment


@auniquesun

Thanks for sharing the paper and code. It's great work.

Uni3D has scaled the point encoder to 1B parameters, which is rather large. However, Objaverse 1.0 contains only about 800K 3D objects, which I think is still too small to support pre-training a 1B-parameter point encoder, so its generalization still lags far behind its counterparts in the image and text domains.

Now that Objaverse-XL has been released, containing 10M+ 3D objects, does your team plan to pre-train on the larger Objaverse-XL? I believe BAAI has the computing resources to complete such a task. What do you think?
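To make the scale argument concrete, here is a minimal back-of-envelope sketch of the data-to-parameter ratio, using only the figures mentioned above (1B parameters, 800K vs. 10M objects); the numbers are illustrative, not measurements from the Uni3D codebase:

```python
# Rough data-to-parameter ratios for a 1B-parameter point encoder.
# Figures come from the discussion above; this is purely illustrative.
params = 1_000_000_000       # ~1B-parameter Uni3D point encoder
objaverse_1_0 = 800_000      # Objaverse 1.0: ~800K 3D objects
objaverse_xl = 10_000_000    # Objaverse-XL: 10M+ 3D objects

ratio_1_0 = objaverse_1_0 / params
ratio_xl = objaverse_xl / params
scale_up = objaverse_xl / objaverse_1_0

print(f"Objaverse 1.0: {ratio_1_0:.1e} objects per parameter")
print(f"Objaverse-XL:  {ratio_xl:.1e} objects per parameter")
print(f"Objaverse-XL is {scale_up:.1f}x larger")
```

Even with Objaverse-XL, the dataset grows only 12.5x while the objects-per-parameter ratio stays far below what image-text models like CLIP enjoy, which is part of why larger-scale pre-training seems worth trying.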

@QiopeWallt

I also think scaling up pre-training with the Objaverse-XL dataset could greatly improve the generalization of the point encoder. I'm curious: has your team processed the Objaverse-XL dataset for Uni3D adaptation yet? If not, perhaps there's an opportunity for collaboration.
