
Submitted models/marble hd2 sensor config 3 #516

Closed

wants to merge 14 commits into from

Conversation

@bfotheri (Contributor)

The MARBLE virtual track team has updated their vehicles' systems since we submitted their models. We talked with Angela, and she said that if they had built the vehicles, perhaps you would consider these additional sensor configurations. Here are images showing the two vehicles and reflecting this sensor configuration.

[Images: tmp_1596226782657, tmp_1596226820604 - photos of the two vehicles]

@nkoenig requested a review from adlarkin August 4, 2020 16:27
@adlarkin (Contributor) left a comment

I've got a few things that need to be addressed:

  1. When I try to view the depth cloud in rviz (for example, topic /ROBOT_NAME/back/image_raw), I am given the following error message (for both the regular and optical frame):
Error updating depth cloud: Depth image has invalid format (only 16 bit and float are supported)!
  2. When viewing the camera output in rviz, the lighting seems very dark (the robot doesn't seem to have any lights on it) - is this expected?
  3. Sometimes, when I drive the robot manually, I'll tell it to go straight forward (positive x command velocity only), but after a few seconds the robot oscillates significantly to one side or the other as it moves forward - this seems like a bug on the model side that should be investigated/addressed.
    • Also, the robot seemed to respond sluggishly to my command velocities, but perhaps that's due to limited resources on the machine I'm testing the model with.
  4. The model is a solid gray color in the gazebo simulator - is that expected, or should it be colored?
  5. This model has a breadcrumbs topic, but when I try to drop a breadcrumb, nothing happens. Can you get breadcrumbs fixed, or remove them from this model if it does not support breadcrumbs?

Comment on lines +27 to +28
<min_velocity>-1</min_velocity>
<max_velocity>1</max_velocity>
Contributor

These velocities seem pretty low - are these the correct values, or should they be higher?

Contributor Author

Owing to COVID-19 we do not have validation data. I will ask MARBLE if that has changed in the past couple of weeks. Those were the values assigned in lieu of validation data.


Contributor

The HD2 model will continue to have 1 m/s maximum velocity (matching X1 and Husky) until validation can be provided. Matching the speed to the most similar pre-existing model avoids any new model having a speed advantage that has not been validated.

@bfotheri (Contributor Author)

  1. I was unable to reproduce this error. The depth cloud points on /ROBOT_NAME/front_down_rgbd_camera/depth/points visualized in rviz with no errors or warnings for me. /ROBOT_NAME/front_down/optical/depth and /ROBOT_NAME/front_down/depth could also be visualized with rqt_image_view without problems or errors.

  2. Good catch, I've added some lights based on the robot models.

  3. I was able to reproduce this. The cause isn't yet clear to me, but it appears to be a combination of the multi-wheel approximation of the tread and perhaps the collision model. We will investigate this, but for now the controllers we use are able to compensate, so it's not a huge problem for us.

  4. When I uncomment lines 155-166 of the model.sdf to enable the textures and materials of the model, the simulation is unable to find the paths and consequently crashes. Much trial and error has not solved the problem; it may, however, just be an artifact of my local setup.

  5. The necessary plugin has been added to the spawner.rb file. This should work now.

Let me know if you have any further comments or questions.

@osrf deleted a comment from adlarkin Aug 18, 2020
@adlarkin (Contributor)

> 1. I was unable to reproduce this error. The depth cloud points on /ROBOT_NAME/front_down_rgbd_camera/depth/points visualized in rviz with no errors or warnings for me. /ROBOT_NAME/front_down/optical/depth and /ROBOT_NAME/front_down/depth could also be visualized with rqt_image_view without problems or errors.

It looks like I was trying to view the wrong topic in rviz - I don't see this error anymore either.

> 2. Good catch, I've added some lights based on the robot models.

Lights look good!

> 3. I was able to reproduce this. The cause isn't yet clear to me, but it appears to be a combination of the multi-wheel approximation of the tread and perhaps the collision model. We will investigate this, but for now the controllers we use are able to compensate, so it's not a huge problem for us.

Although this may not be a problem for your team since you're using controllers, I'm hesitant to release a model with this navigation bug for other teams to use. I'll talk with the other people on the SubT team and get back to you regarding the severity of this bug for release.

> 4. When I uncomment lines 155-166 of the model.sdf to enable the textures and materials of the model, the simulation is unable to find the paths and consequently crashes. Much trial and error has not solved the problem; it may, however, just be an artifact of my local setup.

Textures are not loading properly because MARBLE_HD2_SENSOR_CONFIG_3 needs to exist on fuel or in your fuel cache in order for the texture materials to load (but your model is not on fuel yet since it is still in review). As a temporary workaround, you should be able to create a ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/ directory and place all relevant files there (model.*, thumbnails/, materials/, and meshes/). I tried this local workaround, but things still were not working for me. I am looking into this and will get back to you.

> 5. The necessary plugin has been added to the spawner.rb file. This should work now.

I just tried dropping breadcrumbs again, but I am still not seeing anything dropped by the robot. Looking at your latest commit, it doesn't look like spawner.rb was modified. Perhaps you forgot to push the changes made to spawner.rb? Also, just in case you need it, here is an example of how to add the breadcrumbs plugin to spawner.rb:

<plugin filename="libignition-gazebo-breadcrumbs-system.so"
        name="ignition::gazebo::systems::Breadcrumbs">
  <topic>/model/#{_name}/breadcrumb/deploy</topic>
  <max_deployments>12</max_deployments>
  <breadcrumb>
    <sdf version="1.6">
      <model name="#{_name}__breadcrumb__">
        <pose>-0.45 0 0 0 0 0</pose>
        <include>
          <uri>https://fuel.ignitionrobotics.org/1.0/openrobotics/models/Breadcrumb Node</uri>
        </include>
      </model>
    </sdf>
  </breadcrumb>
</plugin>
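
Once the plugin is in place, you can trigger a drop to verify it. A minimal test sketch, assuming the SubT ROS bridge exposes the deploy topic as /ROBOT_NAME/breadcrumb/deploy with message type std_msgs/Empty:

# Topic name and message type are assumptions based on the standard SubT API.
$ rostopic pub --once /ROBOT_NAME/breadcrumb/deploy std_msgs/Empty "{}"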

Also, once you're done adding breadcrumbs, would you mind adding to specifications.md that this model has breadcrumbs, including the maximum number of breadcrumbs that can be dropped?

@bfotheri (Contributor Author) commented Aug 19, 2020

DEVELOPMENT: Regarding the strange steering, I did a little investigation and found that this problem exists in all the configurations, so it is likely due to the fact that we approximate the treads with four wheels on each side. When these models were made I did not have access to cave circuit worlds, so I tested them in urban environments. The vehicle drives fine on flat or tilted surfaces, but on uneven topography only some of the wheels are in contact with the ground, often an unequal number on the right and left sides, which causes the strange veering to the right or left. I know Gazebo had a tracked vehicle plugin, but Ignition does not. This is kind of a big item, so let me know what you think.

@bfotheri (Contributor Author)

Quick question: is there any chance this will be accepted before cave_circuit?

@acschang (Contributor) left a comment

This sensor configuration will not be incorporated for Cave Circuit, but feedback is included here and in-line for later consideration.

  • Please connect the floating sensors and add shading to the existing robot mesh. If you are unable to, we will provide assistance in removing from the single mesh any preexisting sensors that are not present in this sensor configuration.
  • Please provide sensor specification documentation for the simulated HD MIPI cameras.
  • Please adjust the rear facing HD MIPI sensor prefix to be rear as opposed to back to align with the established SubT API.
  • Please adjust the OS-1 sensor data topics to horizontal_points and vertical_points from horiz_points and vert_points respectively to align with the established SubT API.
  • Please continue to provide information and look into the issue with unintended veering over uneven topography. We will continue to provide support to your effort and pursue the issue in parallel.
  • The validation data is missing. Please complete the validation testing as required by the Simulation Model Preparation Guide and submit the required data.

Comment on lines 57 to 70
* Sensor specification links:
* D435i RGBD Camera - https://www.intelrealsense.com/depth-camera-d435i/
* (2x) Ouster 3D Lidar (64 Channel) - https://ouster.com/products/os1-lidar-sensor/ (one vertically oriented and another horizontally oriented)
* RPLidar S1 Planar Lidar - https://www.slamtec.com/en/Lidar/S1Spec
* IMU: Microstrain 3DM-GX5-25 - datasheet: https://www.microstrain.com/sites/default/files/applications/files/3dm-gx5-25_datasheet_8400-0093_rev_n.pdf
* Explanation of sensor parameter derivations:
We derived the stddev terms as follows:

accelerometer noise density = 0.00002 g/sqrt(Hz)
  => convert to m/s^2 => 1.962e-4 m/s^2/sqrt(Hz)
gyro noise density = 0.005 deg/s/sqrt(Hz)
  => convert to rad/s => 8.72664e-5 rad/s/sqrt(Hz)

Other terms are difficult to extract from the datasheet, so we used terms similar to previously proposed IMU models, such as the ADIS 16448 (which has worse performance than this IMU).
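
A quick numeric check of the two conversions quoted above (python3 one-liners; g taken as 9.81 m/s^2, matching the quoted result):

$ python3 -c "print(0.00002 * 9.81)"                    # accel: g/sqrt(Hz) -> m/s^2/sqrt(Hz), ~1.962e-4
$ python3 -c "import math; print(math.radians(0.005))"  # gyro: deg/s/sqrt(Hz) -> rad/s/sqrt(Hz), ~8.72665e-5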
Contributor

Please add supporting documentation for the HD MIPI cameras.

@adlarkin (Contributor)

Thank you @acschang and @azeey for your review findings/comments.

> When I uncomment lines 155-166 of the model.sdf to enable the textures and materials of the model, the simulation is unable to find the paths and consequently crashes. Much trial and error has not solved the problem; it may, however, just be an artifact of my local setup.

@bfotheri I was able to talk with someone else on the SubT team to get this resolved. There's currently a bug that can't handle relative PBR file paths if the model has not been uploaded to fuel yet (see this issue and this comment). Once your model is approved, merged, and uploaded to fuel, this should no longer be an issue. In the meantime, you can use the following temporary workaround to test this model's materials/textures locally:

  1. Change model://MARBLE_HD2_SENSOR_CONFIG_3/ in lines 155-167 of model.sdf to use the full path. On my machine, the model files are located in /home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3. This means that lines 155-167 in model.sdf would now look like this for me:
          <pbr>
            <metal>
              <albedo_map>/home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/HD2_Albedo.png</albedo_map>
              <metalness_map>/home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/HD2_Metalness.png</metalness_map>
              <roughness_map>/home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/HD2_Roughness.png</roughness_map>
            </metal>
          </pbr>
          <!-- fallback to script if no PBR support-->
          <script>
            <uri>/home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/scripts/</uri>
            <uri>/home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/</uri>
            <name>HD/HD2_Diffuse</name>
          </script>
  2. Create a fuel cache for this model by creating a ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/ directory:
$ mkdir -p ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/
  3. Copy materials/, meshes/, thumbnails/, and model.* over to the newly created fuel cache directory:
# go to the model's directory in the subt repo
# (this path will probably be different on your machine)
$ cd /home/developer/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3

$ cp -r materials/ ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/
$ cp -r meshes/ ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/
$ cp -r thumbnails/ ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/
$ cp model.* ~/.ignition/fuel/fuel.ignitionrobotics.org/OpenRobotics/models/MARBLE_HD2_SENSOR_CONFIG_3/1/
  4. Build the workspace with the subt repo and start a simulator with MARBLE_HD2_SENSOR_CONFIG_3. Here is how the model looks for me with the materials/textures loaded:

[Image: marble_model_with_materials_textures - the HD2 model rendered with materials/textures loaded]

As @acschang mentioned, it looks like some sensors are "floating" in this model.

@azeey mentioned this pull request Aug 31, 2020
@bfotheri (Contributor Author)

Update: the driving issue has been fixed. One of the wheels was larger than the others; this has been rectified in my latest commits, and the veering behavior appears to be resolved. @azeey Thank you for finding this!

@acschang (Contributor) left a comment

This sensor configuration is under consideration for the final circuit. The currently outstanding tasks are listed below:

  • Please connect the floating sensors and add shading to the existing robot mesh. If you are unable to, we will provide assistance in removing from the single mesh any preexisting sensors that are not present in this sensor configuration.
  • The provided documentation for the HD MIPI cameras lists the cameras as having an 808x608 resolution, a 50 Hz frame rate, and a lens with an 80 degree FoV. The parameters for the modeled sensor are a 1280x960 resolution, a 15 Hz frame rate, and a 60 degree FoV. Please either correct the discrepancy or explain why the simulated sensor parameters differ from the provided documentation (see the conversion sketch after this list).
  • The validation data is missing. Please complete the validation testing as required by the Simulation Model Preparation Guide and submit the required data.
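
For reference, SDF specifies a camera's horizontal field of view in radians, so the documented vs. modeled lens values correspond to the following <horizontal_fov> settings (a conversion sketch derived from the degree figures above):

$ python3 -c "import math; print(math.radians(60), math.radians(80))"
# -> ~1.0472 (modeled 60 deg) vs ~1.3963 (documented 80 deg)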

@nkoenig (Contributor) commented Jan 13, 2021

The bounding box for this model is

Min[-0.689611 -0.332041 -0.135] Max[0.5479 0.332197 0.68003]
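
That corresponds to overall model dimensions of roughly 1.24 x 0.66 x 0.82 m; a quick check derived from the min/max values above:

$ python3 -c "print(round(0.5479 + 0.689611, 6), round(0.332197 + 0.332041, 6), round(0.68003 + 0.135, 6))"
# -> 1.237511 0.664238 0.81503 (x, y, z extents in meters)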

@nkoenig requested review from adlarkin and removed request for adlarkin March 1, 2021 20:27
@bfotheri closed this Mar 30, 2021