Unity RGB-D Simulated Camera to ROS #345
Comments
This issue has been marked stale because it has been open for 14 days with no activity. Please remove the stale label or comment on this issue, or the issue will be automatically closed in the next 14 days.
Thank you for posting this. I will look into this issue.
Thank you for taking the time.
Do you also see frame drops when you publish images to ROS and view them in RViz? My frame rate can drop from ~200 Hz to 40 Hz when publishing images.
Dear @vidurvij-Unity, I am working on this problem as well. For me, the issue is that the publishing and the rendering of the cameras are done in the same function, and therefore my depth camera has to wait until the color camera is finished. I know that rendering is bound to one thread, so stacking an unlimited number of cameras is not possible.
Right now another problem is that packing the image data into a message takes a lot of time (20-30 ms) in EncodeToJPG(), since the data has to be compressed to avoid transferring huge amounts of data. Can this step also be moved to a thread?
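To illustrate the idea of moving compression off the render loop, here is a minimal sketch in Python (not Unity C#): the main loop only enqueues raw frames, and a worker thread compresses and hands them to the publisher. `zlib` stands in for JPEG compression, and the queue size and frame shape are illustrative assumptions; whether Unity's own encode API may be called off the main thread is a separate question this sketch does not answer.

```python
import queue
import threading
import zlib  # stand-in for JPEG compression; Unity would use EncodeToJPG()

frame_queue = queue.Queue(maxsize=4)   # bounded so memory stays flat
published = []                         # stands in for the ROS publisher

def compression_worker():
    """Compress frames off the main loop so encoding never blocks rendering."""
    while True:
        frame = frame_queue.get()
        if frame is None:              # sentinel: shut down
            break
        compressed = zlib.compress(frame)
        published.append(compressed)
        frame_queue.task_done()

worker = threading.Thread(target=compression_worker, daemon=True)
worker.start()

# "Render loop": enqueue raw frames without waiting for compression.
for i in range(3):
    raw_frame = bytes([i]) * 640 * 480          # fake 640x480 8-bit image
    try:
        frame_queue.put_nowait(raw_frame)       # drop the frame if the worker is behind
    except queue.Full:
        pass

frame_queue.join()                              # wait for in-flight frames
frame_queue.put(None)                           # stop the worker
worker.join()
print(len(published))  # 3
```

The bounded queue with `put_nowait` deliberately drops frames rather than stalling the render loop, which is usually the right trade-off for a sensor stream.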
You might be interested in the implementation at https://github.com/fsstudio-team/ZeroSimROSUnity. I also raised an issue there about distortion of the point cloud (due to what I believe is an incorrect depth data representation and possibly the camera info): fsstudio-team/ZeroSimROSUnity#20. Also, the format of the depth image should be 32FC1, and I'm not sure you can use EncodeToJPG() for that.
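For reference, a `sensor_msgs/Image` with encoding `32FC1` carries one little-endian float32 per pixel (depth in meters), row-major. The sketch below packs such a buffer with plain Python `struct` (no rospy); the field names mirror the ROS message, and the depth values are made up for illustration.

```python
import struct

width, height = 4, 2
depths = [0.5, 1.0, 1.5, 2.0,
          2.5, 3.0, 3.5, 4.0]          # meters, row-major

# Fields as they would appear in sensor_msgs/Image:
encoding = "32FC1"
is_bigendian = 0
step = width * 4                        # bytes per row: 4 bytes per float32 pixel
data = struct.pack("<%df" % (width * height), *depths)

assert len(data) == step * height       # buffer size must match step * height

# Round-trip one pixel (row 1, col 2) to check the layout.
offset = (1 * width + 2) * 4
(z,) = struct.unpack_from("<f", data, offset)
print(z)  # 3.5
```

This is also why JPEG is the wrong transport for depth: it is lossy and 8-bit, while 32FC1 needs the full float32 value per pixel (ROS uses `compressed_depth_image_transport` with PNG for lossless depth compression).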
@panagelak the project is very interesting, especially its use of AsyncGPUReadbackRequest.
Hello, I really like this project!
I believe the community is missing an easy example of simulating an RGB-D camera in Unity and transferring the topics to ROS efficiently.
I have created a starter project for this purpose here: https://github.com/panagelak/rgbd_unity_camera
I believe the best way to do this is to send four topics into ROS, namely
Then we can decompress the RGB and depth images in ROS and combine them with the camera info topics to construct (optionally colored) point clouds, through the image_transport packages.
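The depth-plus-camera-info-to-point-cloud step above is the standard pinhole back-projection. A minimal sketch, with made-up example intrinsics (fx, fy, cx, cy are illustrative, not values from the project):

```python
fx, fy = 525.0, 525.0    # focal lengths in pixels (assumed example values)
cx, cy = 319.5, 239.5    # principal point (assumed example values)

def backproject(u, v, z):
    """Pixel (u, v) with metric depth z -> 3D point in the camera frame."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the principal point maps straight down the optical axis.
print(backproject(319.5, 239.5, 2.0))   # (0.0, 0.0, 2.0)
```

In ROS these intrinsics come from the `K` matrix of the `camera_info` topic, which is why publishing correct camera info from Unity matters as much as the depth image itself.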
For convenience I have included these packages from the ROS image_transport packages, slightly modified so they can be included as libraries.
I tried to do it myself, but unfortunately I am not very familiar with Unity and shaders :(
I also included some code from here: https://www.immersivelimit.com/tutorials/unity-depth-camera-simulation
But the depth image I managed to get was a gray image in RGB format instead of the 32FC1 that ROS expects for depth images.
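A gray 8-bit image is not just the wrong encoding label; it also destroys depth precision. A quick illustration of the quantization error, assuming an illustrative 0-5 m depth range (the range is my assumption, not from the project):

```python
near, far = 0.0, 5.0              # assumed depth range in meters
levels = 256                      # 8-bit gray gives only 256 distinct values
step = (far - near) / (levels - 1)

def quantize(z):
    """Round depth to the nearest representable 8-bit level."""
    code = round((z - near) / step)
    return near + code * step

z_true = 1.234
z_quant = quantize(z_true)
print(round(step, 4))                       # 0.0196 -> about 2 cm per level
print(round(abs(z_true - z_quant), 4))      # quantization error for this pixel
```

With 32FC1 the per-pixel value is a full float32, so none of this rounding occurs, which is why converting the shader output to a float depth buffer (rather than a gray texture) is the part worth getting right.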
So I think the difficult part will be the following two things
If somebody can help me with that, that would be great!
You can open a pull request on my repo for now (ROS Noetic, Unity 2020.3.25), so the code doesn't have to meet high standards, just work.
Later I think it would be a great addition to have as an official example (along with ROS actions :P).
Thank you,
I am awaiting your input.