Mobile / touch support #462
Comments
In case anybody else needs it, my current implementation can be found here. It supports both one-finger and two-finger interaction; however, it is currently limited to viewport manipulation. You can try it out here. If the maintainers of this project are interested, I would be willing to (at least try to) provide a PR for this feature. However, due to my limited knowledge of this library, I would need some guidance on how to implement it. In addition to the open questions above, I would need to know
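For readers sketching a similar implementation: the core of a two-finger gesture can be reduced to two pure computations, a zoom factor from the change in finger distance and a pan delta from the change in the fingers' midpoint. A minimal sketch under those assumptions (all names are illustrative, not sprotty API):

```typescript
interface Point { x: number; y: number; }

function distance(a: Point, b: Point): number {
    return Math.hypot(a.x - b.x, a.y - b.y);
}

function midpoint(a: Point, b: Point): Point {
    return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
}

/** Zoom factor implied by a pinch: ratio of current to previous finger distance. */
function pinchZoomFactor(prev: [Point, Point], curr: [Point, Point]): number {
    return distance(curr[0], curr[1]) / distance(prev[0], prev[1]);
}

/** Pan delta implied by the movement of the two fingers' midpoint. */
function pinchPanDelta(prev: [Point, Point], curr: [Point, Point]): Point {
    const m0 = midpoint(prev[0], prev[1]);
    const m1 = midpoint(curr[0], curr[1]);
    return { x: m1.x - m0.x, y: m1.y - m0.y };
}
```

Feeding these two values into the viewport's zoom and scroll actions is then independent of how the touch events are dispatched.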
Great, thanks! I think a contribution would be awesome. My first thought is to create a dedicated TouchTool/TouchListener mechanism along the lines of the existing mouse support. The most important question I have about it is: would the two mechanisms interfere with each other? That is, if I do something on a touch device, does that trigger both a TouchEvent and a MouseEvent?
Sadly, yes, though not for everything. For example, if you touch a single point, you get both the touch events and the corresponding mouse events. There are multiple options for how this could be handled.
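One such option is to de-duplicate the mouse events that browsers synthesize after a touch, either by calling `preventDefault()` in the touch handler or by ignoring mouse events that arrive shortly after the last touch. A sketch of the latter approach (the class name and the time window are assumptions, not part of any existing API):

```typescript
// Ignores mouse events that arrive within a short window after a touch,
// on the assumption that they were synthesized by the browser from that
// touch rather than produced by a real mouse.
class TouchMouseDeduplicator {
    private lastTouchTime = -Infinity;

    constructor(private readonly windowMs: number = 500) {}

    /** Call from the touch handler with the event's timestamp. */
    noteTouch(timestampMs: number): void {
        this.lastTouchTime = timestampMs;
    }

    /** True if a mouse event at this time is likely touch-synthesized. */
    shouldIgnoreMouse(timestampMs: number): boolean {
        return timestampMs - this.lastTouchTime < this.windowMs;
    }
}
```

The `preventDefault()` route is simpler when it is acceptable to suppress the synthesized events entirely; the timestamp route keeps both listeners fully independent.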
One more thing: handling both in the same listener would probably make some features easier to implement, as logic could be shared more easily. For example, if move should be supported via touch, one would either need to duplicate lots of code or move code into a "helper class" that handles the core logic. However, that again would be a breaking change, as all these helper methods in MoveMouseListener are protected, not private.
I see. I can imagine that it would still make sense to have a separate TouchTool/TouchListener mechanism and to handle the interference with mouse events specifically in those listeners where we want to support both input methods, e.g. by extracting interfaces for the shared logic.
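A dedicated mechanism along these lines could mirror the listener/tool pairing of the existing mouse support. A minimal sketch under that assumption (TouchListener, TouchTool, and the action-string return type are all illustrative, not actual sprotty interfaces):

```typescript
// Hypothetical listener interface mirroring sprotty's MouseListener:
// each handler receives the target element's id and a simplified event,
// and returns a list of actions to dispatch.
interface TouchListener {
    touchStart?(targetId: string, event: { touches: number }): string[];
    touchMove?(targetId: string, event: { touches: number }): string[];
    touchEnd?(targetId: string, event: { touches: number }): string[];
}

// Hypothetical tool that fans touch events out to registered listeners,
// analogous to how MouseTool drives MouseListeners.
class TouchTool {
    private listeners: TouchListener[] = [];

    register(listener: TouchListener): void {
        this.listeners.push(listener);
    }

    dispatch(
        kind: "touchStart" | "touchMove" | "touchEnd",
        targetId: string,
        touches: number
    ): string[] {
        const actions: string[] = [];
        for (const listener of this.listeners) {
            const handler = listener[kind];
            if (handler) actions.push(...handler.call(listener, targetId, { touches }));
        }
        return actions;
    }
}
```

With this split, a viewport listener could live purely in the touch world, while listeners that support both input methods implement both interfaces and de-duplicate internally.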
(Just FYI, I did not forget about this; however, I will be busy for the next two weeks, so the PR will take some time, sorry.)
Currently, to my knowledge, there is no mobile / touch event support (well, mouseDown works, but that's not what I mean).
I think that, in particular for viewport manipulation (zoom and scroll), this could be a useful feature. There might be other features that could benefit from this, too (e.g. move).
Some open questions I had when I thought a bit about the feature:
Maybe it would be an option, as a first step, to add support for touch events in general (either using MouseTool or a dedicated TouchTool). For the time being, I know that sprotty is flexible enough (and I am very thankful for that) that I can implement this myself for my use case; I just thought that others might benefit from such a feature, too.