Submitted models/marble hd2 sensor config 3 #516
I've got a few things that need to be addressed:
- When I try to view the depth cloud in RViz (for example, topic /ROBOT_NAME/back/image_raw), I am given the following error message in RViz (for both the regular and optical frames):
  Error updating depth cloud: Depth image has invalid format (only 16 bit and float are supported)!
- When viewing the camera output in rviz, the lighting seems to be very dark (the robot doesn't seem to have any lights on it) - is this expected?
- Sometimes, when I drive the robot manually, I'll tell it to go straight forward (positive x command velocity only), but after a few seconds the robot will oscillate significantly to one side or the other as it moves forward - this seems like a bug on the model side that should be investigated/addressed.
- Also, the robot seemed to respond sluggishly to my command velocities, but perhaps that's due to limited resources on the machine I'm testing the model with.
- The model is a solid gray color in the gazebo simulator - is that expected, or should it be colored?
- This model has a breadcrumbs topic, but when I try to drop a breadcrumb, nothing happens. Can you get breadcrumbs fixed, or remove them from this model if this model does not support breadcrumbs?
<min_velocity>-1</min_velocity>
<max_velocity>1</max_velocity>
These velocities seem pretty low - are these the correct values, or should they be higher?
Owing to COVID-19 we do not have validation data. I will ask MARBLE if that has changed in the past couple of weeks. Those were the values assigned in lieu of validation data.
The HD2 model will continue to have 1 m/s maximum velocity (matching X1 and Husky) until validation can be provided. Matching the speed to the most similar pre-existing model avoids any new model having a speed advantage that has not been validated.
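For context, here is a minimal sketch of where these limits typically sit, in the DiffDrive plugin block that the spawner script emits for a skid-steer model like this one. The joint names, wheel geometry, acceleration limits, and command topic below are illustrative placeholders, not values taken from this PR:

<plugin filename="libignition-gazebo-diff-drive-system.so"
        name="ignition::gazebo::systems::DiffDrive">
  <!-- placeholder joint names; the real model drives four wheel joints per side -->
  <left_joint>front_left_wheel_joint</left_joint>
  <right_joint>front_right_wheel_joint</right_joint>
  <wheel_separation>0.6</wheel_separation>      <!-- illustrative -->
  <wheel_radius>0.09</wheel_radius>             <!-- illustrative -->
  <topic>/model/ROBOT_NAME/cmd_vel_relay</topic>
  <!-- speed limiter: the two values under discussion -->
  <min_velocity>-1</min_velocity>
  <max_velocity>1</max_velocity>
  <min_acceleration>-3</min_acceleration>       <!-- illustrative -->
  <max_acceleration>3</max_acceleration>        <!-- illustrative -->
</plugin>

If validation data later supports a higher speed, only the two velocity values would need to change.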
Let me know if you have any further comments or questions in response to these replies.
It looks like I was trying to view the wrong topic in RViz - I don't see this error anymore either.
Lights look good!
Although this may not be a problem for your team since you're using controllers, I'm hesitant to release a model with this navigation bug for other teams to use. I'll talk with the other people on the SubT team and get back to you regarding the severity of this bug for release.
The reason why textures are not loading properly is because
I just tried dropping breadcrumbs again, but I am still not seeing anything dropped by the robot. Looking at your latest commit, it doesn't look like the breadcrumb deployment setup used in subt/submitted_models/costar_husky_sensor_config_2/launch/spawner.rb (lines 55 to 69 in a1a886b) has been added to this model.
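For reference, here is a rough sketch of the kind of Breadcrumbs plugin block that spawner.rb embeds in the robot SDF for models that support breadcrumbs. The deployment count, drop pose, and Fuel URI are illustrative assumptions rather than the exact contents of the costar_husky file, and ROBOT_NAME stands in for the name interpolated by spawner.rb:

<plugin filename="libignition-gazebo-breadcrumbs-system.so"
        name="ignition::gazebo::systems::Breadcrumbs">
  <!-- deploy command topic for this robot -->
  <topic>/model/ROBOT_NAME/breadcrumb/deploy</topic>
  <max_deployments>12</max_deployments>          <!-- illustrative count -->
  <breadcrumb>
    <sdf version="1.6">
      <model name="ROBOT_NAME__breadcrumb__">
        <pose>-1.2 0 0 0 0 0</pose>              <!-- illustrative drop point behind the robot -->
        <include>
          <uri>https://fuel.ignitionrobotics.org/1.0/openrobotics/models/Breadcrumb Node</uri>
        </include>
      </model>
    </sdf>
  </breadcrumb>
</plugin>

Without a block like this in the spawned SDF, publishing to the breadcrumbs topic will not deploy anything.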
Also, once you're done adding breadcrumbs, would you mind adding to
DEVELOPMENT: In regard to the strange steering, I did a little investigation and found that this problem exists in all the configurations, so it is likely due to the fact that we approximate the treads by using four wheels on each side. When these models were made I did not have access to cave circuit worlds, so I tested them in urban environments. It appears that this vehicle drives fine on flat or tilted surfaces, but on uneven topography only some of the wheels are in contact with the ground, often an unequal number on the right and left sides, which causes the strange veering to the right or left. I know Gazebo had a tracked-vehicle plugin, but Ignition does not. This is kind of a big item, so let me know what you guys think.
Quick question: is there any chance this will be accepted before Cave Circuit?
This sensor configuration will not be incorporated for Cave Circuit, but feedback is included here and in-line for later consideration.
- Please connect floating sensors and add shading to the existing robot mesh. We will provide assistance in removing preexisting sensors that are not present on this robot model sensor configuration from the single mesh if you are unable to.
- Please provide sensor specification documentation for the simulated HD MIPI cameras.
- Please adjust the rear-facing HD MIPI sensor prefix to be rear as opposed to back to align with the established SubT API.
- Please adjust the OS-1 sensor data topics to horizontal_points and vertical_points from horiz_points and vert_points respectively to align with the established SubT API (a sketch of both renames follows this list).
- Please continue to provide information and look into the issue with unintended veering over uneven topography. We will continue to provide support to your effort and pursue the issue in parallel.
- The validation data is missing. Please complete the validation testing as required by the Simulation Model Preparation Guide and submit the required data.
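As a sketch of what the two topic renames could look like in vehicle_topics.launch, assuming the usual ros_ign_bridge parameter_bridge pattern; the Ignition-side topic paths, node names, and the $(arg name) argument are assumptions for illustration, not taken from the actual launch file:

<node pkg="ros_ign_bridge" type="parameter_bridge" name="ros_ign_bridge_rear_camera"
      args="/model/$(arg name)/rear/image_raw@sensor_msgs/Image@ignition.msgs.Image">
  <!-- ROS-side topic uses the rear prefix rather than back -->
  <remap from="/model/$(arg name)/rear/image_raw" to="rear/image_raw"/>
</node>
<node pkg="ros_ign_bridge" type="parameter_bridge" name="ros_ign_bridge_horizontal_lidar"
      args="/model/$(arg name)/horiz_points@sensor_msgs/PointCloud2@ignition.msgs.PointCloudPacked">
  <!-- rename the Ignition-side horiz_points to the SubT API name -->
  <remap from="/model/$(arg name)/horiz_points" to="horizontal_points"/>
</node>

The relative "to" names assume these bridge nodes are launched inside the robot's namespace, which the /ROBOT_NAME/... topics mentioned earlier in this thread suggest they are.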
* Sensor specification links:
  * D435i RGBD Camera - https://www.intelrealsense.com/depth-camera-d435i/
  * (2x) Ouster 3D Lidar (64 Channel) - https://ouster.com/products/os1-lidar-sensor/ (one vertically oriented and another horizontally oriented)
  * RPLidar S1 Planar Lidar - https://www.slamtec.com/en/Lidar/S1Spec
  * IMU: Microstrain 3DM-GX5-25 - datasheet: https://www.microstrain.com/sites/default/files/applications/files/3dm-gx5-25_datasheet_8400-0093_rev_n.pdf
* Explanation of sensor parameter derivations:
  We derived the stddev terms as follows:
  * accelerometer noise density = 0.00002 g/sqrt(Hz) => converted to m/s^2 => 1.962e-4 m/s^2/sqrt(Hz)
  * gyro noise density = 0.005 deg/s/sqrt(Hz) => converted to rad/s => 8.72664e-5 rad/s/sqrt(Hz)
  Other terms are difficult to extract from the datasheet, so we used values similar to previously proposed IMU models of similar or worse quality, such as the ADIS 16448 (which has worse performance than this IMU).
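For reference, the arithmetic behind those two numbers, assuming g = 9.81 m/s^2:

  0.00002 g/sqrt(Hz) * 9.81 (m/s^2)/g = 1.962e-4 m/s^2/sqrt(Hz)
  0.005 deg/s/sqrt(Hz) * (pi/180) rad/deg ≈ 8.7266e-5 rad/s/sqrt(Hz)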
Please add supporting documentation for the HD MIPI cameras.
Thank you @acschang and @azeey for your review findings/comments.
@bfotheri I was able to talk with someone else on the SubT team to get this resolved. There's currently a bug that can't handle relative PBR file paths if the model has not been uploaded to Fuel yet (see this issue and this comment). Once your model is approved, merged, and uploaded to Fuel, this should no longer be an issue. However, for the time being, in order to test this model's materials/textures locally, you can use the following temporary workaround:
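One possible shape for such a workaround, offered as an assumption rather than the exact steps from the linked comment (the texture file names and the absolute path below are hypothetical): temporarily point the PBR maps in model.sdf at absolute local paths while testing, then revert to the relative paths before the model is merged and uploaded to Fuel.

<material>
  <diffuse>1 1 1 1</diffuse>
  <pbr>
    <metal>
      <!-- relative paths the model should ship with (hypothetical file names):
           materials/textures/hd2_albedo.png, materials/textures/hd2_normal.png -->
      <!-- temporary absolute paths for local testing only -->
      <albedo_map>/home/USER/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/hd2_albedo.png</albedo_map>
      <normal_map>/home/USER/subt_ws/src/subt/submitted_models/marble_hd2_sensor_config_3/materials/textures/hd2_normal.png</normal_map>
    </metal>
  </pbr>
</material>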
As @acschang mentioned, it looks like some sensors are "floating" in this model.
Update: The driving issue has been fixed. One of the wheels was larger than the others; this has been rectified in my latest commits and the behavior appears to be resolved. @azeey Thank you for finding this!
This sensor configuration is under consideration for the final circuit. The currently outstanding tasks are listed below:
- Please connect floating sensors and add shading to the existing robot mesh. We will provide assistance in removing preexisting sensors that are not present on this robot model sensor configuration from the single mesh if you are unable to.
- The provided documentation for the HD MIPI cameras lists the cameras as having an 808x608 resolution and a 50 Hz frame rate, and the listed lens has an 80 degree FoV. The parameters listed for the modeled sensor are a 1280x960 resolution, a 15 Hz frame rate, and a 60 degree FoV. Please either correct the discrepancy or provide an explanation for the simulated sensor parameters differing from the provided documentation (the relevant SDF fields are sketched after this list).
- The validation data is missing. Please complete the validation testing as required by the Simulation Model Preparation Guide and submit the required data.
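For reference, these are the SDF fields where that discrepancy shows up. The sensor name is illustrative, and the comments note the documented values alongside the currently modeled ones:

<sensor name="camera_rear" type="camera">
  <update_rate>15</update_rate>               <!-- documentation lists 50 Hz -->
  <camera>
    <horizontal_fov>1.0472</horizontal_fov>   <!-- 60 deg; the documented 80 deg lens would be about 1.396 rad -->
    <image>
      <width>1280</width>                     <!-- documentation lists 808 -->
      <height>960</height>                    <!-- documentation lists 608 -->
      <format>R8G8B8</format>
    </image>
  </camera>
</sensor>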
The bounding box for this model is
The MARBLE virtual track team has updated their systems vehicles since we submitted their models. We talked with Angela, and she said that if they had built the vehicles, perhaps you would consider these additional sensor configurations. Here are the images showing the two vehicles and reflecting this vehicle sensor configuration. I