
Add support for more virtual devices #5

Open
ABeltramo opened this issue Mar 18, 2024 · 8 comments

Comments

@ABeltramo

Hello there!
It looks like we've both been busy rewriting Sunshine from scratch for the past few years! I've been working on games-on-whales/wolf, and lately I've been trying to split some modules out into libraries so that we don't keep reinventing the wheel.

I've started with inputtino: a tiny library that builds on top of uinput (and uhid, more on this later) to provide implementations of different virtual devices. If you'd like, I'd be happy to collaborate to bring that to Moonshine as well!

I saw that you'd like to add support for gyro and acceleration and unfortunately that doesn't seem to be possible using uinput; I wrote a little rationale here if you are interested in the details. So I went down the rabbit hole and I've managed to properly emulate a DualSense joypad using uhid instead and I'm currently testing it via Sunshine on this PR.

Is this something that might interest you too?

@hgaiser
Owner

hgaiser commented Mar 18, 2024

Hey, thanks for reaching out! Wolf sounds very cool and ambitious, kudos! I'm curious if the name is also derived from Moonlight, haha, what a creative bunch we are.

I haven't looked deeply into force feedback and gyro events yet. Your input is greatly appreciated 👍 (pun intended). I thought I saw that I can register an evdev device with force feedback capabilities and listen for force feedback events. Wouldn't this allow rumble support? I haven't looked at gyro or trackpad support at all so I can't say much about that.

I do like the approach of letting hid-playstation or hid-sony pick up the device, it seems more realistic. I'm a bit more hesitant about the added permissions, though it seems these can be automated by packaging udev rules.

I also saw that Wolf is using gstreamer, not ffmpeg (or libav). I am looking into changing the pipeline to use PipeWire and the new Vulkan video encoding extensions, instead of NvFBC and NVENC. What are your experiences with this?

I do hope to look into inputtino and Wolf more in the future, but thought I'd already write this response.

@ABeltramo
Author

I'm curious if the name is also derived from Moonlight, haha what a creative bunch we are.

Of course it is! Wolf is howling at the moonlight! 😅

I thought I saw that I can register an evdev device with force feedback capabilities and listen for force feedback events. Wouldn't this allow rumble support?

It's all explained here.
TL;DR: the issue is that gyro, acceleration and touchpad end up as separate devices under /dev/input/event* and linking them back up into a single joystick device is left to the consuming libraries. I've tested SDL2 deeply, and the way it tries to match sensors with joypads can't work with uinput, because uinput is missing the ability to set the uniq string for a device.
Rumble, on the other hand, can work via uinput, and I have implemented it that way for the Xbox and Nintendo virtual joypads.

I also saw that Wolf is using gstreamer, not ffmpeg (or libav). I am looking into changing the pipeline to use PipeWire and the new Vulkan video encoding extensions, instead of NvFBC and NVENC. What are your experiences with this?

There's been another recent project that uses the new Vulkan encoding APIs: https://github.com/colinmarc/magic-mirror . I believe that will be the future, and hopefully it'll allow us to massively simplify all our codebases across multiple GPUs and OSes; it seems to be unstable at the moment, but I'm curious to see where this goes.

As for our Gstreamer pipeline, we are in the lucky position where we don't need PipeWire or NvFBC: we run a custom Wayland compositor and directly grab the video framebuffer so that we can pass it as is to the encoding pipeline.
This way we can support multiple users and run even on a headless host.

I do hope to look into inputtino and Wolf more in the future

I'm drafting up a simple C API and some Rust bindings so that it would be easier to include it in non-C++ projects. If you'd be interested I can help you include inputtino into moonshine instead of re-implementing all this logic from scratch!

@hgaiser
Owner

hgaiser commented Mar 20, 2024

It's all explained here. TL;DR: the issue is that gyro, acceleration and touchpad end up as separate devices under /dev/input/event* and linking them back up into a single joystick device is left to the consuming libraries. I've tested SDL2 deeply, and the way it tries to match sensors with joypads can't work with uinput, because uinput is missing the ability to set the uniq string for a device. Rumble, on the other hand, can work via uinput, and I have implemented it that way for the Xbox and Nintendo virtual joypads.

Right, but this is how hid-playstation does it. Why couldn't we create a single event device using evdev to trigger all events (normal buttons + gyro + touchpad)? There's probably a reason this doesn't work, I'm curious what it is though.

There's been another recent project that uses the new Vulkan encoding APIs: https://github.com/colinmarc/magic-mirror . I believe that will be the future, and hopefully it'll allow us to massively simplify all our codebases across multiple GPUs and OSes; it seems to be unstable at the moment, but I'm curious to see where this goes.

As for our Gstreamer pipeline, we are in the lucky position where we don't need PipeWire or NvFBC: we run a custom Wayland compositor and directly grab the video framebuffer so that we can pass it as is to the encoding pipeline. This way we can support multiple users and run even on a headless host.

Yeap, I hope so too. How about the choice between ffmpeg and gstreamer? I picked ffmpeg simply because I could find more reference material for it. I'm curious if gstreamer would have been better, they seem to have better Rust support at the very least.

I'm drafting up a simple C API and some Rust bindings so that it would be easier to include it in non-C++ projects. If you'd be interested I can help you include inputtino into moonshine instead of re-implementing all this logic from scratch!

That would be very cool. I'd be interested in that for sure.

@ABeltramo
Author

ABeltramo commented Mar 21, 2024

Right, but this is how hid-playstation does it. Why couldn't we create a single event device using evdev to trigger all events (normal buttons + gyro + touchpad)?

I'm not sure of the details of why they split this into 3 devices; my guess is that it's mainly for performance reasons: gyro and acceleration generate far more events than the joypad/touchpad, and cramming them all together into a single device would probably hurt games that weren't built to deal with that kind of noise.

On a separate note, you sure can create your own virtual device that spits out any kind of event, but you can rest assured that no game/app will recognize your custom events. 😅 That's why I based all my tests on SDL; I know this doesn't cover 100% of use cases, but a pretty large share of gaming goes through SDL (Steam Input is based on it).

How about the choice between ffmpeg and gstreamer? I picked ffmpeg simply because I could find more reference material for it. I'm curious if gstreamer would have been better, they seem to have better Rust support at the very least.

I very much like GStreamer's plugin system: you can easily compose pipelines and change/configure them entirely via a simple string (which I have fully exposed in Wolf's config.toml, so that users can deeply change the settings without having to manually build and compile anything).
The downside seems to be that it lags a bit behind ffmpeg feature-wise: newer things tend to land in ffmpeg first. Still, I personally wouldn't go back to ffmpeg given how clean my code is for dealing with all the GPUs/drivers/encoders. For example, I added support for AV1 without having to write any code apart from the required Moonlight protocol bits.
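As an illustration of that composability, a whole encode pipeline can be described as a single string. This is a generic example, not Wolf's actual pipeline:

```
videotestsrc num-buffers=300 ! videoconvert
  ! x264enc tune=zerolatency bitrate=8000
  ! h264parse ! mp4mux ! filesink location=test.mp4
```

Swapping `x264enc` for, say, `vaapih264enc` or `nvh264enc` is a one-token change in that string, which is exactly what makes exposing it in config.toml so flexible.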

@hgaiser
Owner

hgaiser commented Mar 21, 2024

Thanks for the input. I will definitely be looking at gstreamer some more. They already have a PR for Vulkan Video Extensions, which I am very much looking forward to.

Regarding the controller part, let me know what I can do, or how you are progressing :).

@ABeltramo
Author

I'll probably put something together over the weekend; I'll ping you once it's ready!

@ABeltramo
Author

This took me longer than expected (as usual). I've added a common C ABI, and I've finally managed to get Rust to compile and call that API; you can see the code here: https://github.com/games-on-whales/inputtino/tree/stable/bindings/rust

I'll wrap the methods in Rust so that you'll get a nicer API to use.

@hgaiser
Owner

hgaiser commented Mar 24, 2024

No worries, I am short on time to look into improving input handling at the moment anyway. Seems you are making good progress!
