Unity Support #36
@nadir500 We have gotten several requests to support MediaPipe in the Unity engine. Can you share your use cases with us? We would love to work with some teams on contributions that bring MediaPipe into Unity. |
Some potential use cases:
- VR hand tracking (similar to devices like Leap Motion); some VR headsets such as the HTC Vive Pro and Valve Index already have RGB cameras built in
- AR hand tracking
- Object detection/segmentation using a cell phone camera, HoloLens camera, or PC camera
|
Actually, there is an SDK from Viveport that supports hand tracking in Unity. That SDK is still in early access. I don't know the difference between the two implementations, but I guess it would be great if there were some sort of cooperation between the two teams to boost the process. And for some use cases, check the videos here. |
+other folks
…On Wed, Aug 21, 2019 at 6:34 PM Logon13 wrote:
Some potential use cases:
- VR hand tracking (similar to devices like Leap Motion); some VR headsets such as the HTC Vive Pro and Valve Index already have RGB cameras built in
- AR hand tracking
- Object detection/segmentation using a cell phone camera, HoloLens camera, or PC camera
|
I also support this! |
Would love to get a Unity port as well. There aren't any open-source options for hand-tracking in AR, especially for mobile devices. This works perfectly on my phone. There are some like ManoMotion which support hand tracking in 2D but they are paid and in closed beta. If this can be used with Unity then that would help a lot of developers around who are looking to integrate a more natural interaction into their Augmented Reality experiences. The use case for VR is even more obvious. |
Maybe this could be in the style of the C API from TFLite (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/experimental/c/c_api.h)? The string containing the definition of the graph could be passed from the .NET runtime to the native API with P/Invoke calls. I would say it could even be possible to create custom calculators in C#, with the managed methods (GetContract(), Open(), and Process()) passed to the C API as function pointers to be invoked from there. The incentive would be to make it possible to use this alongside arcore-unity-sdk (https://github.com/google-ar/arcore-unity-sdk), in a fashion where ARCore passes the camera image to MediaPipe through CameraImage (https://developers.google.com/ar/reference/unity/class/GoogleARCore/Frame/CameraImage), and maybe ARFoundation as well (which also has an API for retrieving the camera image), so it would take the form of a subsystem. These are only ideas, as I haven't dived into MediaPipe enough to have a solid opinion. |
+Chris McClanahan <cmcclanahan@google.com> +Chuo-Ling Chang <chuoling@google.com> +Matthias Grundmann <grundman@google.com> |
The Aristo API from HTC is currently much inferior to Google's approach, because it relies on the stereo cameras of the Vive Pro; as tested on the original Vive's mono camera, it works really badly. The Android version is also very limited and burns through the battery easily, which limits its use alongside other computationally heavy tasks like XR. |
Hi, I can add a few more ideas. I would say there is a spectrum (as always in programming) of ways to solve this (i.e., add Unity support):
At one end, there is the ARCore approach, where the majority of the ARCore API is mapped into C#, including all the C/C++/C# wrappers. This obviously requires a lot of C# code (and C interfaces) to be written, but it provides the greatest flexibility in developer use cases and in how tightly you can integrate into the C# application.
At the other end, there is the option of minimizing the amount of C#/C wrapper code that needs to be written, by writing a native class (similar to hello_world.cc, or FrameProcessor.java) that handles all the graph initialization and running. In the simplest case, a custom graph runner could expose just a few functions to C#: InitAndStartGraph, AddPacket, RetrieveResultPacket, ShutdownGraph. This would be more of a black-box approach, treating running a graph like calling a function.
Depending on the application, one approach may be more fitting than the other (considering the amount of effort involved), or some hybrid of the two. A side note for future reference:
|
I have been looking for this solution as well. I fully support this effort! |
The HTC Vive solution is not good enough compared with this MediaPipe hand tracking. I tested both, and even though the Vive solution runs on a desktop, it does not compare with MediaPipe running on a slow Android smartphone. MediaPipe hand tracking is stunningly good, and it will be a game changer for UI in the near future. |
- VR hand tracking for Oculus Quest/Rift S. It is so important for a better user experience with more natural interaction input; finger tracking is definitely at the top of the list. Also, more and more VR HMDs will use inside-out tracking, meaning they all have cameras onboard. |
I hope for this feature too. It is very important for us. |
I fully support this. Imagine being able to interact with the game/object itself using hand tracking. |
A step in the right direction would be headless hand tracking, using UnityPlayer.UnitySendMessage to send the coordinates from Android to the Unity activity, and then positioning the hand in the camera viewport according to the received coordinates. A first step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in the Unity Editor during play mode. That by itself could become a great development tool. I might try to do this myself, but if someone does something like this, please post your results! |
There are other models for hand tracking, but they are probably not as performant as Google's version: https://github.com/lmb-freiburg/hand3d |
Hi, just wondering why this issue is closed now? Is the Unity plugin available? |
Like Seberta, I'd like to know. Has any headway been made on this endeavor? |
Unity support would be an excellent step. Especially for a variety of Augmented Reality apps and solutions, a MediaPipe plugin for Unity would be a great help. |
I find it kind of weird that this feature request was closed without any answer. |
I also support this. When will it be available? |
We are looking into this and will update the thread when we have something more definite. We welcome contributions from folks in the thread. |
I am also looking forward to this. A lot of clients are asking me for hand and finger tracking in AR apps on iOS and Android. Any news about this? Thank you! |
Also interested in support of this. |
Interested too |
Interested in Unity support for it |
@midopooler Nothing so far, it seems. |
Why was this issue closed, though? We should have a Unity integration of this library. |
Same opinion. |
Same here. |
It's been 2 years, I could write an essay of use cases if you need. Many current projects would benefit from this |
Maybe this could help? https://github.com/homuler/MediaPipeUnityPlugin I didn't test it, but I have it on my list. |
The main challenge in Unity currently is making hand tracking with depth perception an option for VR/AR interactions using a mobile camera, instead of tracking devices such as Leap Motion, which has no Android SDK. Of course, it all has to be tuned in real time, since the slightest lag would hurt the experience in some cases. |
Unity support for MediaPipe would be really welcome. Anyone who has worked on AR/VR/MR applications would love a stable hand-tracking option that does not depend on expensive and/or walled-off hardware. I have seen some really good work on hand tracking with just a single RGB camera by some university teams a few years back, but none of it was opened up to the public, and I won't be surprised if most of it ended up in either Facebook's or Microsoft's collection of patents. OpenCV seems to be the only open option available at the moment, but its hand-tracking options are not polished enough to be used commercially. Options like ManoMotion are way too expensive for individual developers. |
Yes, we really need this stuff for Unity! |
Support for Unity Barracuda would be of even more benefit to developers. |
Just an API would be really neat to have to make really cool 3D applications |
@rtrn1337 have you tried the plugin? |
Yes, I did. Some features work in the Unity Editor, but I get an error in Xcode when I try to build for a device. I haven't had time to test it more closely yet. |
I was following this thread. It started long ago, when there were not many devices that supported ARCore; now there are many. It would be great if you could share the use-case list! |
Hi, |
@EigenSpiral |
@boehm-e Thank you for sharing! |
There isn't really depth with MediaPipe, so Leap or any depth-sensing camera is better there. I would strongly advise you to try the linked sample repo above and compare for yourself in your own project! |
We recently released Easy ML Kit, which aims to bring Google's ML Kit, MediaPipe, and TensorFlow into Unity. So far we have released two features (barcode scanning and object detection); others are in development. We have the ARFoundation camera as one of the input sources (along with a live camera and images/textures), which will be quite useful for AR apps. It would be great if you could share which features you are expecting, so we can prioritize. |
Hi,
It would be great if there were a way to connect MediaPipe finger tracking with controlling virtual hand objects in Unity.
…On Sat, May 7, 2022 at 5:02 AM IP wrote:
We recently released Easy ML Kit <https://u3d.as/2PMe>, which aims to bring Google's ML Kit, MediaPipe, and TensorFlow into Unity. So far we have released two features (barcode scanning and object detection); others are in development. We have the ARFoundation camera as one of the input sources (along with a live camera and images/textures), which will be quite useful for AR apps. It would be great if you could share which features you are expecting, so we can prioritize.
|
Sure! Will definitely make a note of it! |
Can we see a Unity Engine port of it?
It would be so great if we could use it inside the engine.