
Would you recommend a way to get a 360-degree view? #480

Closed
timegate opened this issue Sep 11, 2020 · 3 comments

Comments


timegate commented Sep 11, 2020

❓ Questions and Help

Could you recommend a way to get a 360-degree view using habitat-api?
I tried simply changing

config.SIMULATOR.RGB_SENSOR.HFOV = 360
config.SIMULATOR.DEPTH_SENSOR.HFOV = 360
config.SIMULATOR.SEMANTIC_SENSOR.HFOV = 360

but it doesn't work :( and the rendered images each show only a single solid color.

With the mp3d dataset and the default value (HFOV=90), it worked well.
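
For reference, a minimal runnable sketch of the attempt, assuming the stock pointnav example config that ships with habitat-api (the config path and dataset setup are assumptions, not part of the original report):

```python
import habitat

# Example task config; any config with RGB/DEPTH/SEMANTIC sensors behaves the same.
config = habitat.get_config("configs/tasks/pointnav.yaml")
config.defrost()
config.SIMULATOR.RGB_SENSOR.HFOV = 360       # invalid: pinhole HFOV must be < 180
config.SIMULATOR.DEPTH_SENSOR.HFOV = 360
config.SIMULATOR.SEMANTIC_SENSOR.HFOV = 360
config.freeze()

env = habitat.Env(config=config)
observations = env.reset()  # "rgb" / "depth" / "semantic" come back as solid colors
```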


Skylion007 (Contributor) commented Sep 11, 2020

Yeah, that won't work with the pinhole camera model. The pinhole camera model is only defined for FOVs in the range 0 <= FOV < 180 degrees: the image half-width grows as tan(HFOV / 2), which diverges as the FOV approaches 180 degrees.

Currently, we are working on a hack where you render a cubemap (a 90-degree-FOV camera in each of the 6 cardinal orientations) and then stitch the faces into a single equirectangular image using a cubemap-to-equirectangular projection implemented as a PyTorch layer. Long term, we plan to write a cubemap shader in Habitat-Sim to support this faster and to make better use of the native GPU cubemap texture units.
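
To make the idea concrete, here is a minimal NumPy sketch of such a cubemap-to-equirectangular lookup (nearest-neighbour only, not the PyTorch layer described above; the face naming and sign conventions are one consistent choice and must match how the six faces are actually rendered):

```python
import numpy as np

def cube_to_equirect(faces, out_h, out_w):
    """Resample six 90-degree-FOV cube faces into one equirectangular image.

    `faces` maps 'front' (+z), 'back' (-z), 'right' (+x), 'left' (-x),
    'up' (+y), 'down' (-y) to (S, S, C) arrays of equal size.
    """
    f0 = faces["front"]
    S = f0.shape[0]

    # Unit view direction for every output pixel.
    v, u = np.meshgrid(np.arange(out_h), np.arange(out_w), indexing="ij")
    lon = (u / out_w - 0.5) * 2.0 * np.pi   # longitude in [-pi, pi)
    lat = (0.5 - v / out_h) * np.pi         # latitude in (-pi/2, pi/2]
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Scale each direction so its dominant axis component becomes +-1.
    m = np.maximum(np.maximum(np.abs(x), np.abs(y)), np.abs(z))
    xs, ys, zs = x / m, y / m, z / m
    ax = np.argmax(np.stack([np.abs(x), np.abs(y), np.abs(z)]), axis=0)

    out = np.zeros((out_h, out_w) + f0.shape[2:], dtype=f0.dtype)
    # (face, pixel mask, in-face horizontal coord, in-face vertical coord),
    # with in-face coordinates in [-1, 1].
    lookups = [
        ("right", (ax == 0) & (x > 0), -zs, -ys),
        ("left",  (ax == 0) & (x < 0),  zs, -ys),
        ("up",    (ax == 1) & (y > 0),  xs,  zs),
        ("down",  (ax == 1) & (y < 0),  xs, -zs),
        ("front", (ax == 2) & (z > 0),  xs, -ys),
        ("back",  (ax == 2) & (z < 0), -xs, -ys),
    ]
    for name, mask, fu, fv in lookups:
        cols = np.clip(np.round((fu[mask] + 1) / 2 * (S - 1)), 0, S - 1).astype(int)
        rows = np.clip(np.round((fv[mask] + 1) / 2 * (S - 1)), 0, S - 1).astype(int)
        out[mask] = faces[name][rows, cols]
    return out
```

Given six 90-degree renders of matching resolution, `cube_to_equirect(faces, 512, 1024)` yields a full 360 x 180 panorama; a differentiable PyTorch version would precompute this lookup as a sampling grid and apply `torch.nn.functional.grid_sample` instead of integer indexing.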

If you want multiple cameras on your agent, you can use #472 to set up multiple visual sensors with different orientations and UUIDs and then stitch the outputs together yourself; see the sketch below.
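
For example, following the pattern from #472, something like this (the ORIENTATION field, its [pitch, yaw, roll] ordering, and the AGENT_0.SENSORS list are assumptions based on that PR and may differ in your installed version):

```python
import numpy as np
import habitat

config = habitat.get_config("configs/tasks/pointnav.yaml")  # example config
config.defrost()
# Four 90-degree RGB cameras facing front/left/back/right.
sensor_names = []
for name, yaw in [("FRONT", 0.0), ("LEFT", np.pi / 2),
                  ("BACK", np.pi), ("RIGHT", 3 * np.pi / 2)]:
    sensor = config.SIMULATOR.RGB_SENSOR.clone()
    sensor.UUID = f"rgb_{name.lower()}"
    sensor.ORIENTATION = [0.0, yaw, 0.0]  # [pitch, yaw, roll] in radians (assumed)
    setattr(config.SIMULATOR, f"RGB_SENSOR_{name}", sensor)
    sensor_names.append(f"RGB_SENSOR_{name}")
config.SIMULATOR.AGENT_0.SENSORS = sensor_names
config.freeze()

env = habitat.Env(config=config)
obs = env.reset()
panorama = np.concatenate(
    [obs["rgb_front"], obs["rgb_right"], obs["rgb_back"], obs["rgb_left"]],
    axis=1,
)  # naive side-by-side stitch; a proper 360 view needs a projection like the one above
```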

timegate (Author) commented


Wow, thanks for your kind comment. I'm very happy to know that you are working on the hack. Thanks for your help!

Skylion007 (Contributor) commented

@timegate #478 should be able to mimic a 360-degree sensor.
