Is it possible to input a geometric proxy to help guide meshing? #402
Comments
There is no such feature yet. One thing that would be nice to add in Meshroom is the ability to take RGBD images as input and use the provided depth maps in the dense part of the pipeline. Best,
Thank you for getting back to me so quickly; is the potential future feature (to add RGBD images as input) connected with what is described in issue #399?
Yes, it's related. The only difference is that if the depth maps are rendered (as in issue #399), the scale of the scene will be the same between the rendered depth maps and the reconstructed cameras.
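As an illustration of the scale point above: when the depth maps come from a separate sensor (such as an HMD) rather than being rendered with the reconstructed cameras, the two coordinate frames will generally differ by an unknown similarity transform that has to be estimated first. Below is a minimal sketch of the standard Umeyama alignment in plain NumPy; aligning on corresponding camera centres is just one possible choice of correspondences, and nothing here is part of Meshroom's API.

```python
import numpy as np

def umeyama_similarity(src, dst):
    """Estimate (s, R, t) such that dst ≈ s * R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. camera
    centres recovered by the depth sensor and by the SfM reconstruction.
    """
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst

    # Cross-covariance between the two centred point sets.
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)

    # Guard against a reflection sneaking into the recovered rotation.
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    D = np.diag([1.0, 1.0, d])

    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (src_c ** 2).sum(axis=1).mean()
    t = mu_dst - s * R @ mu_src
    return s, R, t
```

Once (s, R, t) is known, every proxy vertex v can be mapped into the reconstruction frame as s * R @ v + t, after which the proxy shares the scale of the reconstructed cameras.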
I think you are way short of the number of photos that you really need to recreate that environment.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Is this feature still being considered?
AFAIK, there is no active development on this topic, but it is an interesting subject and we would be happy to support any initiative in this area.
No, and not that I know of.
Hi,

I have a somewhat unusual question. I have acquired an input scene with ~150 photos, which have been well matched and aligned.
Due to some thin, feature-poor regions of the scene, the meshing stage produces poor results (see the underside and edges of the table, which lead to distorted geometry under the floor).
However, I have a rough geometric model of the scene acquired using active methods (the depth sensors on an AR HMD).
I am wondering whether there is any feature in Meshroom, or in similar software, that would allow me to inject this rough geometric model as a constraint, encouraging the meshing to produce a refined model that is more physically plausible given the geometric proxy.
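Since no such feature currently exists in Meshroom (see the comments above), one possible workaround outside Meshroom is to fuse the proxy geometry with the dense photogrammetric point cloud and re-mesh the combined set with a generic surface reconstruction. The sketch below uses Open3D, assumes the proxy has already been aligned and scaled to the reconstruction, and uses hypothetical file names; it is not Meshroom's pipeline, just one way to let the proxy support the weak regions.

```python
import open3d as o3d

# Dense point cloud exported from the photogrammetry pipeline and the rough
# proxy mesh from the HMD, assumed to be in the same (aligned, scaled) frame.
dense = o3d.io.read_point_cloud("dense_cloud.ply")      # hypothetical path
proxy = o3d.io.read_triangle_mesh("proxy_aligned.obj")  # hypothetical path

# Sample oriented points from the proxy so it can back up the weakly
# supported regions (thin table edges, featureless underside).
proxy.compute_vertex_normals()
proxy_pts = proxy.sample_points_poisson_disk(number_of_points=200_000)

# Poisson reconstruction needs consistently oriented normals, so give the
# photogrammetric points normals as well.
dense.estimate_normals()
dense.orient_normals_consistent_tangent_plane(30)

# Merge the two point sets and reconstruct a single surface over them.
combined = o3d.geometry.PointCloud()
combined += dense
combined += proxy_pts
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(combined, depth=10)
o3d.io.write_triangle_mesh("guided_mesh.ply", mesh)
```

The proxy points mainly act as a prior where photogrammetric coverage is thin; sampling fewer of them keeps the photo-derived detail dominant elsewhere.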