Expose OpenXR raw hand tracking data #78032
Conversation
(force-pushed from f7f5681 to 72131e8)
Hey, since a few friends were asking me about this I wanted to take a look. Overall it seems pretty straightforward. I'm a little uncertain about the Array-of-Dictionary return value. I know there are some existing APIs which do this, but there are a lot of allocations needed and no auto-complete. Since it's only 2 (velocity) or 3 (position) values each, I'm wondering if it would be possible to separate each value into its own function call and return a TypedArray or PackedArray for the Vector3 values and a TypedArray for the rotations? Something like:

```cpp
Vector<Vector3> get_hand_joint_positions(Hand p_hand);
TypedArray<Quaternion> get_hand_joint_rotations(Hand p_hand);
Vector<float> get_hand_joint_radii(Hand p_hand);
Vector<Vector3> get_hand_joint_linear_velocities(Hand p_hand);
Vector<Vector3> get_hand_joint_angular_velocities(Hand p_hand);
```

Another approach would be, for example, having getters which take a joint index for each:

```cpp
Vector3 get_hand_joint_position(Hand p_hand, Hand::HandJoints p_joint);
Quaternion get_hand_joint_rotation(Hand p_hand, Hand::HandJoints p_joint);
float get_hand_joint_radius(Hand p_hand, Hand::HandJoints p_joint);
Vector3 get_hand_joint_linear_velocity(Hand p_hand, Hand::HandJoints p_joint);
Vector3 get_hand_joint_angular_velocity(Hand p_hand, Hand::HandJoints p_joint);
```
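For context, a minimal GDScript sketch of how the per-joint getter style could be used from a script once exposed; the method and enum names follow the suggestion above and may differ from what is finally merged:

```gdscript
extends Node3D

# Sketch only: assumes the per-joint getters discussed above are exposed on
# OpenXRInterface. Exact method and enum names may differ in the merged API.
var openxr: OpenXRInterface

func _ready() -> void:
	openxr = XRServer.find_interface("OpenXR") as OpenXRInterface

func _process(_delta: float) -> void:
	if openxr == null:
		return
	for joint in OpenXRInterface.HAND_JOINT_MAX:
		var pos := openxr.get_hand_joint_position(OpenXRInterface.HAND_LEFT, joint)
		var rot := openxr.get_hand_joint_rotation(OpenXRInterface.HAND_LEFT, joint)
		# Use pos/rot to drive an avatar skeleton, debug gizmos, etc.
```

Calling a getter per joint avoids building an Array of Dictionaries every frame, at the cost of more calls across the script boundary.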
Another thing that should be added is the ability to set_motion_range, and I would also prefer it not use a dictionary, since the function will most likely be called every frame. Also, I encountered a problem where the left hand is the right hand, and the right hand cannot be accessed.
This solution would be better due to not needing memory allocations every frame.
@lyuma good suggestions, that shouldn't be a hard change and indeed would probably be more efficient memory-wise. @Faolan-Rad it shouldn't be hard to also expose the motion range; we already implement it in the API, we just need to expose it to GDScript.
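As an illustration of what that exposure might look like from GDScript, here is a hedged sketch; the method and enum names below are assumptions based on the discussion, not the merged API:

```gdscript
# Hypothetical sketch: ask the runtime to conform the tracked hand to a
# controller grip instead of the unobstructed hand range. Names are assumed.
var openxr := XRServer.find_interface("OpenXR") as OpenXRInterface
if openxr:
	openxr.set_motion_range(OpenXRInterface.HAND_LEFT,
			OpenXRInterface.HAND_MOTION_RANGE_CONFORM_TO_CONTROLLER)
```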
(force-pushed from 72131e8 to f4a4d92)
@lyuma @Faolan-Rad I've made the requested changes, indeed neater this way!
It looks good to me! Thanks
@BastiaanOlij Can you do the requested style fixes?
(force-pushed from f4a4d92 to 58df9bd)
ah thanks for the poke, done!
Thanks!
This is perfect in terms of accessing the hand tracking data, so that I can map it to whatever non-OpenXR-compatible hand skeleton I usually have on hand (e.g. one where the bones are aligned with the Y-axis, as is the convention for Blender models), except for one small issue: we can't tell whether the hand positions are valid without using a completely separate OpenXRHand node and checking its active flag. Is it possible to add some kind of hand_activated and/or confidence rating to this interface? These might be more applicable than the finger bone velocities (though I might experiment with those at some stage to see what happens when you project the position forwards by 0.01 seconds).
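For reference, the velocity experiment mentioned above amounts to a simple extrapolation once the per-joint velocities are available; a sketch only, with method names assumed from the getters discussed earlier:

```gdscript
# Hypothetical sketch: extrapolate a joint position 0.01 s ahead using the
# reported linear velocity. Getter names are assumed, not the confirmed API.
func predicted_joint_position(openxr: OpenXRInterface, hand: int, joint: int) -> Vector3:
	var pos := openxr.get_hand_joint_position(hand, joint)
	var vel := openxr.get_hand_joint_linear_velocity(hand, joint)
	return pos + vel * 0.01
```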
Thanks @goatchurchprime, sorry, I had forgotten this was already merged when I asked you to comment. I'll find a moment next week to add the confidence information; it makes perfect sense to expose that too.
This might be higher priority than I thought, since I can't make the OpenXRHand node (needed only for the active/tracking_confidence flag) work when I use it in the Godot XR Tools demos. The docs say that it needs to be a direct child of the XROrigin3D, and there are two origins when running the demos. I've tried every combination of putting these nodes in either of them, in and out of their base scenes, but the visible flag is always false even when the hands are tracking. I know that OpenXRHand works in my other, simpler project (no instanced scenes, no dynamically loaded second scenes), so it is mind-boggling that I can't reproduce a working version by hacking the XR Tools demo. Since I can't state the bug in a way that would make it debuggable, I don't want to report it. However, an active/tracking_confidence function would mean that the OpenXRHand object would not need to be used. [I'd go further and say that OpenXRHand could be deprecated, since OpenXR-compatible hand skeletons are as rare as hen's teeth.]
This PR adds methods to OpenXRInterface that retrieve the raw tracking data from OpenXR's hand tracking. This has been requested by a few people who wish to apply the tracking data to full-body avatars, or who just want to use the data outside of the skeleton implementation.
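As a rough illustration of the avatar use case, a script could poll the new getters each frame and write the rotations into a Skeleton3D through a joint-to-bone name map. The map and getter names below are hypothetical, and real retargeting would also have to account for rest poses and axis conventions:

```gdscript
# Hypothetical sketch: copy raw OpenXR joint rotations onto avatar bones.
# BONE_NAMES and the getter names are assumptions, not the exact merged API.
extends Skeleton3D

const BONE_NAMES := {
	OpenXRInterface.HAND_JOINT_WRIST: "LeftWrist",
	OpenXRInterface.HAND_JOINT_INDEX_TIP: "LeftIndexTip",
	# ... one entry per joint the avatar rig cares about
}

func _process(_delta: float) -> void:
	var openxr := XRServer.find_interface("OpenXR") as OpenXRInterface
	if openxr == null:
		return
	for joint in BONE_NAMES:
		var bone := find_bone(BONE_NAMES[joint])
		if bone != -1:
			set_bone_pose_rotation(bone,
					openxr.get_hand_joint_rotation(OpenXRInterface.HAND_LEFT, joint))
```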