Fitting code #7
Hi, the UV coordinates are embedded in the model file. You can refer to this function to see how I export the geometry with the UV coordinates. Thanks!
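A minimal sketch of what such an export could look like, assuming the mesh is available as arrays `verts` (V×3 positions), `uvs` (T×2 texture coordinates), `faces` (F×3 vertex indices), and `faces_uv` (F×3 UV indices); these names are placeholders, not the repository's actual function:

```python
def export_obj_with_uv(path, verts, uvs, faces, faces_uv):
    """Write an OBJ file that keeps per-face UV indices alongside the geometry.

    verts: (V, 3) positions, uvs: (T, 2) texture coordinates,
    faces: (F, 3) vertex indices, faces_uv: (F, 3) UV indices (all 0-based).
    """
    with open(path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for vt in uvs:
            f.write(f"vt {vt[0]} {vt[1]}\n")
        # OBJ indices are 1-based; each face corner pairs a vertex with a UV index.
        for fv, ft in zip(faces, faces_uv):
            f.write(
                f"f {fv[0]+1}/{ft[0]+1} {fv[1]+1}/{ft[1]+1} {fv[2]+1}/{ft[2]+1}\n"
            )
```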
Hey,
The RGB-based result in our paper comes from a learning-based method (I2L-MeshNet), not from fitting. If you have one or multiple images, I suggest first getting the joint positions and then regressing the parameters to match those joint positions. The optimization code should be quite similar to nr-reg.
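A minimal sketch of that kind of joint-based fitting, assuming a hypothetical `nimble_layer` that maps pose/shape parameters to vertices and 3D joints (the actual layer and parameter dimensions in this repository may differ):

```python
import torch

def fit_to_joints(nimble_layer, target_joints, n_pose=30, n_shape=20, iters=500):
    # Pose and shape parameters are the optimization variables.
    pose = torch.zeros(1, n_pose, requires_grad=True)
    shape = torch.zeros(1, n_shape, requires_grad=True)
    optim = torch.optim.Adam([pose, shape], lr=1e-2)
    for _ in range(iters):
        optim.zero_grad()
        _, joints = nimble_layer(pose, shape)  # hypothetical forward call
        # L2 distance to the target joints, plus a small prior keeping the
        # parameters close to the mean hand.
        loss = ((joints - target_joints) ** 2).sum()
        loss = loss + 1e-3 * (pose ** 2).sum() + 1e-3 * (shape ** 2).sum()
        loss.backward()
        optim.step()
    return pose.detach(), shape.detach()
```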
If I understand this correctly, this will only give you the geometry, right? How do you get the appearance?
You can use the photometric loss described in HTML. The process would be to set the appearance parameter as a variable for optimization and then, for each image, use a differentiable renderer (such as PyTorch3D) to compute the photometric loss. You can set a fixed lighting condition for all views.
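A minimal sketch of that photometric term with PyTorch3D, assuming `appearance_to_texture` is a hypothetical function mapping the appearance parameter to a UV texture map, and that `verts`, `faces`, `verts_uvs`, `faces_uvs`, and the target image are already on the right device:

```python
import torch
from pytorch3d.structures import Meshes
from pytorch3d.renderer import (
    FoVPerspectiveCameras, RasterizationSettings, MeshRasterizer,
    MeshRenderer, SoftPhongShader, PointLights, TexturesUV,
)

def photometric_loss(appearance, appearance_to_texture,
                     verts, faces, verts_uvs, faces_uvs,
                     target_image, device="cuda"):
    # Appearance parameter -> (1, H, W, 3) texture map (hypothetical mapping).
    tex_map = appearance_to_texture(appearance)
    textures = TexturesUV(maps=tex_map, faces_uvs=[faces_uvs], verts_uvs=[verts_uvs])
    mesh = Meshes(verts=[verts], faces=[faces], textures=textures)

    # Fixed camera and lighting shared across all views, as suggested above.
    cameras = FoVPerspectiveCameras(device=device)
    lights = PointLights(device=device, location=[[0.0, 0.0, 3.0]])
    renderer = MeshRenderer(
        rasterizer=MeshRasterizer(
            cameras=cameras,
            raster_settings=RasterizationSettings(image_size=256),
        ),
        shader=SoftPhongShader(device=device, cameras=cameras, lights=lights),
    )

    rendered = renderer(mesh)[..., :3]  # drop the alpha channel, keep RGB
    return torch.nn.functional.l1_loss(rendered, target_image)
```

The loss can then be summed over all views and minimized jointly over the appearance (and, if desired, the geometry) parameters with a standard optimizer such as Adam.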
Hi,
Thanks for the amazing work. I want to try NIMBLE for my project. Do you have the optimization code for getting the NIMBLE parameters using the ground-truth mesh and texture map?
Also, can you provide the UV coordinates for your mesh? Right now I am using the MANO UV coordinates to generate my texture map, but that is not aligned with the NIMBLE-generated maps.
Best,
Akshay