
Extracting leading vehicle in nuScenes after conversion #65

samueleruffino99 opened this issue Feb 21, 2024 · 9 comments

@samueleruffino99

samueleruffino99 commented Feb 21, 2024

Hello, thank you very much for your work.
Do you think it would be possible to extract the leading vehicle, given the ego position, in the nuScenes dataset after conversion?
What I am thinking about is to:

  • get the lane id the ego vehicle is in (and all the other objects present in it)
  • convert to curvilinear coordinates based on that lane (I was wondering whether there is already a centre line in nuScenes, since LANES data have both 'polylines' and 'polygons')
  • compute the distance in curvilinear coordinates between the ego vehicle and all other objects present in that lane.

I am not quite sure whether to use just lanes or also other objects, actually.
Anyway, I was wondering whether you have some implementation that extracts the lane from the vehicle position or something like that, also in curvilinear coordinates (in MetaDrive as well).
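The last step above could be sketched roughly as follows (the function name and the `lateral_tol` cutoff are my own hypothetical choices, assuming the curvilinear coordinates of each object on the ego's lane are already available):

```python
def find_leading_vehicle(ego_long, others, lateral_tol=2.0):
    """Return the id of the nearest object ahead of the ego on the same lane.

    `others` maps object id -> (longitudinal, lateral) curvilinear
    coordinates on the ego's lane; objects far from the centerline
    (|lateral| > lateral_tol, in meters) are ignored.
    """
    leader, min_gap = None, float("inf")
    for obj_id, (lon, lat) in others.items():
        gap = lon - ego_long
        # Keep only objects ahead of the ego and close to the centerline
        if gap > 0 and abs(lat) <= lateral_tol and gap < min_gap:
            leader, min_gap = obj_id, gap
    return leader


# Example: "a" is the closest object ahead that is still on the lane
others = {"a": (5.0, 0.5), "b": (12.0, 0.2), "c": (-3.0, 0.0), "d": (8.0, 5.0)}
print(find_leading_vehicle(0.0, others))  # a
```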

@QuanyiLi
Member

Yeah, we do have those tools. If the map is loaded with ScenarioMap, you can get all lanes via map.road_network and the polygon for each lane via lane.polygon. If a lane has no polygon, we generate one from its lane center line, so this API will definitely return a polygon. With these polygons, you can easily find which polygon the point (vehicle position) is in and hence get the lane id and lane center line. You can prompt GPT to write the point-in-polygon code :).
Actually, there is an advanced function, ray_localization(), that lets you do this directly. Check: https://github.com/metadriverse/metadrive/blob/c326ae5f6b409ed90d2bcda8b5bc12689c8c03b5/metadrive/component/navigation_module/edge_network_navigation.py#L159
It will return the set of lanes that the point is on. This one might be faster, because the point-in-polygon calculation is done with the physics engine API.
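For reference, the point-in-polygon test mentioned above can be done in a few lines with the classic ray-casting algorithm (a generic sketch, independent of MetaDrive's APIs):

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: True if `point` lies inside `polygon`.

    `polygon` is a list of (x, y) vertices in order; the edge from the
    last vertex back to the first is implied.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of the horizontal ray going right from `point`
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


# Example: a 2x2 square
square = [(0, 0), (2, 0), (2, 2), (0, 2)]
print(point_in_polygon((1, 1), square))  # True
print(point_in_polygon((3, 1), square))  # False
```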

For getting curvilinear coordinates, check PointLane:
https://github.com/metadriverse/metadrive/blob/c326ae5f6b409ed90d2bcda8b5bc12689c8c03b5/metadrive/component/lane/point_lane.py#L17
The polyline of a lane is indeed the centerline, so just create a PointLane object from the lane center line. Then you can get an object's longitudinal and lateral position with lane.local_coordinates(position).
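To see what such a projection does without pulling in MetaDrive, here is a minimal self-contained sketch that projects a point onto a polyline centerline and returns (longitudinal, lateral), with positive lateral on the left of the travel direction. The real PointLane.local_coordinates may differ in details (interpolation, sign convention), so treat this only as an illustration:

```python
import numpy as np


def local_coordinates(polyline, position):
    """Project `position` onto `polyline` (a sequence of (x, y) centerline
    points) and return (longitudinal, lateral).

    Lateral is signed: positive on the left of the travel direction.
    """
    polyline = np.asarray(polyline, dtype=float)
    position = np.asarray(position, dtype=float)
    best_d2, best_lon, best_lat = float("inf"), 0.0, 0.0
    s_start = 0.0  # arc length accumulated before the current segment
    for p0, p1 in zip(polyline[:-1], polyline[1:]):
        seg = p1 - p0
        seg_len = float(np.linalg.norm(seg))
        if seg_len == 0.0:
            continue
        direction = seg / seg_len
        rel = position - p0
        # Clamp the projection onto this segment
        t = float(np.clip(np.dot(rel, direction), 0.0, seg_len))
        foot = p0 + t * direction
        d2 = float(np.sum((position - foot) ** 2))
        if d2 < best_d2:
            # 2D cross product gives the signed lateral offset
            lateral = direction[0] * rel[1] - direction[1] * rel[0]
            best_d2, best_lon, best_lat = d2, s_start + t, float(lateral)
        s_start += seg_len
    return best_lon, best_lat


# Example: a straight centerline along +x; a point 3 m along, 1 m to the left
lon, lat = local_coordinates([(0, 0), (10, 0)], (3, 1))
# lon == 3.0, lat == 1.0
```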

@samueleruffino99
Author

Do you also have some tool to get the leading vehicle?
Thank you very much for your help!


@samueleruffino99
Author

samueleruffino99 commented Feb 22, 2024

[Screenshot 2024-02-22 151050: objects plotted on the ego's current lane]
I am plotting here all the objects that intersect the current lane the ego vehicle is in. However, it seems that the ego and the other vehicles are misaligned in some way (maybe I have a bug in my code). It is strange, since I am using the same function to plot the occupancy box given the state of an object.
```python
import numpy as np


def get_object_occupancy(self, state):
    """Extracts the occupancy of the object, considering only rotation
    (not velocity).

    Args:
        state (State): The state of the object.

    Returns:
        np.ndarray: The four vertices of the occupancy box, rotated by
        the heading.
    """
    # Extract the position, heading, length, and width from the state object
    obj_position = state.position[:2]
    obj_heading = -state.heading
    obj_length = state.length
    obj_width = state.width

    # Compute the rotation matrix (note: sin and cos are assigned in
    # swapped order relative to a standard rotation matrix, and the
    # heading is negated above)
    cos_angle = np.sin(obj_heading)
    sin_angle = np.cos(obj_heading)
    rotation_matrix = np.array([[cos_angle, -sin_angle],
                                [sin_angle, cos_angle]])

    # Define the vertices of the box relative to its center
    half_length = obj_length / 2
    half_width = obj_width / 2
    vertices_relative = np.array([[-half_length, -half_width],
                                  [half_length, -half_width],
                                  [half_length, half_width],
                                  [-half_length, half_width]])

    # Rotate the vertices and translate to the object's position
    rotated_vertices = vertices_relative @ rotation_matrix.T + obj_position

    return rotated_vertices
```

Apparently it works for the ego but not for other objects; why could this happen?
I have also checked the object sizes, and length and width for the ego look different w.r.t. the other objects (the ego's width is larger than its length, while for the others the length is larger).

@QuanyiLi
Member

Where is the scenario from? Waymo or nuScenes?

@QuanyiLi
Member

QuanyiLi commented Feb 23, 2024

If you are testing with nuScenes data, it is a bug from nuScenes... We extract the length and width for all nuScenes objects with the same nuScenes API, but the ego car's width and length turn out to be swapped. So for nuScenes data, the ego's length is actually the width and the width is actually the length.
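A minimal sketch of the workaround described above (the dict-like `state` layout and the flag names here are hypothetical, not MetaDrive's actual API):

```python
def fix_ego_extent(state, is_ego, is_nuscenes):
    """Return (length, width), swapping the two for the ego vehicle in
    nuScenes-converted scenarios, where they are stored inverted.

    `state` is assumed to be a dict with 'length' and 'width' keys
    (hypothetical layout for illustration).
    """
    length, width = state["length"], state["width"]
    if is_ego and is_nuscenes:
        # nuScenes conversion quirk: the ego's extents are swapped
        length, width = width, length
    return length, width


# Example: an ego whose stored "length" (2.0) is really its width
print(fix_ego_extent({"length": 2.0, "width": 4.8}, True, True))  # (4.8, 2.0)
```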

@samueleruffino99
Author

Yes, I am using nuScenes. I swapped width and length for the ego vehicle and it works. But apparently I have problems with the velocities as well (see the attached pictures). All the velocity directions are correct, except for the ego's (it should be rotated by 180 degrees).
In the first image the ego vehicle is moving in the up-left direction over time. This nuScenes scene was converted with your API a few months ago.
[Screenshot 2024-02-23 144345: ego velocity pointing opposite to its direction of travel]
In this other picture all the velocities are correct (converted a few days ago).
Did you perhaps fix a bug in the velocities?
[Screenshot 2024-02-23 144639: all velocity directions correct]

@QuanyiLi
Member

Yes, I guess so. The previous nuScenes converter may have been buggy. If the recent one is good, that's fine. Also, you can find some scenarios in MetaDrive/assets, which can serve as test cases as well.

@samueleruffino99
Author

Actually, I am having problems with the current version.
Anyway, I will check with your test cases and update you.
