Mobile / touch support #462

Closed
nk-coding opened this issue Sep 4, 2024 · 5 comments · Fixed by #475
Labels: discussion (Should be figured out together)

Comments

@nk-coding
Contributor

Currently, to my knowledge, there is no mobile / touch event support (well, mouseDown works, but that's not what I mean).
I think that in particular for viewport manipulation (zoom and scroll), this could be a useful feature. There might be other features that could benefit from this too (e.g. move).
Some open questions I had when thinking a bit about this feature:

  • should a one-finger or a two-finger gesture be used for the scroll/pan feature?
  • when zooming with a pinch gesture, should it also be possible to scroll the canvas at the same time?

Maybe it would be an option, as a first step, to add support for touch events in general (either using MouseTool or a dedicated TouchTool). For the time being, I know (and am very thankful) that sprotty is flexible enough that I can implement this myself for my use case; I just thought that others might benefit from such a feature too.
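
To make the idea more concrete, here is a minimal sketch of the one-finger pan / two-finger pinch-zoom math on raw DOM TouchEvents. The class and method names are made up for illustration and are not part of sprotty's API; the Viewport shape is assumed to mirror sprotty's scroll/zoom viewport.

```typescript
// Hypothetical gesture handler: not sprotty API, just the core viewport math.
interface Viewport {
    scroll: { x: number; y: number };
    zoom: number;
}

class TouchViewportHandler {
    private lastTouches: Touch[] = [];

    onTouchStart(event: TouchEvent): void {
        this.lastTouches = Array.from(event.touches);
    }

    // Returns the new viewport derived from the previous one and the touch deltas.
    onTouchMove(event: TouchEvent, viewport: Viewport): Viewport {
        const touches = Array.from(event.touches);
        let result = viewport;
        if (touches.length === 1 && this.lastTouches.length === 1) {
            // One finger: pan. Scroll is in model coordinates, so divide by zoom.
            const dx = touches[0].clientX - this.lastTouches[0].clientX;
            const dy = touches[0].clientY - this.lastTouches[0].clientY;
            result = {
                scroll: {
                    x: viewport.scroll.x - dx / viewport.zoom,
                    y: viewport.scroll.y - dy / viewport.zoom
                },
                zoom: viewport.zoom
            };
        } else if (touches.length === 2 && this.lastTouches.length === 2) {
            // Two fingers: zoom by the ratio of the current and previous finger distances.
            // (A complete implementation would also adjust scroll so the pinch midpoint
            // stays fixed, which is where the "zoom and scroll simultaneously" question comes in.)
            const dist = (a: Touch, b: Touch) =>
                Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
            const factor = dist(touches[0], touches[1]) /
                dist(this.lastTouches[0], this.lastTouches[1]);
            result = { scroll: viewport.scroll, zoom: viewport.zoom * factor };
        }
        this.lastTouches = touches;
        return result;
    }
}
```

Dispatching the resulting viewport (e.g. via a SetViewportAction) and deciding which gesture wins when fingers are added or removed is the part that would need to be aligned with the existing mouse-based viewport tooling.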

@nk-coding
Contributor Author

In case anybody else needs it, my current implementation can be found here. It supports both one-finger and two-finger interaction; however, it is currently limited to viewport manipulation. You can try it out here.

In case the maintainers of this project are interested, I would be willing to (at least try to) provide a PR for this feature. However, due to my limited knowledge of this library, I would need some guidance on how to implement it. In addition to the open questions above, I would need to know:

  • if touch support should be added to MouseTool/MouseListener, or if an equivalent TouchTool/TouchListener should be added
  • if scrollbars should be supported
  • if the move feature should be supported
  • what configuration options should be supported

@spoenemann added the discussion (Should be figured out together) label on Sep 25, 2024
@spoenemann
Contributor

Great, thanks! I think a contribution would be awesome.

My first thought is let's make a dedicated TouchTool/TouchListener mechanism along the lines of the existing mouse support. The most important question I have about it is: would the two mechanisms interfere with each other? Meaning that if I do something on a touch device, does that trigger both a TouchEvent and a MouseEvent?

@nk-coding
Contributor Author

The most important question I have about it is: would the two mechanisms interfere with each other? Meaning that if I do something on a touch device, does that trigger both a TouchEvent and a MouseEvent?

Sadly, yes, but not for everything. For example, if you touch a single point, you get [touchstart, touchend, mouseover, mouseenter, mousemove, mousedown, mouseup]. However, if you move your finger, you only get touchmove multiple times, but no mousemove events.

There are multiple options for how this could be handled (one way to detect touch-generated mouse events is sketched below this list):

  • ignore touch-caused mouse events in MouseTool (this would be a breaking change, which I don't like)
  • ignore touch-caused mouse events in individual MouseListeners where necessary (in most cases I don't think it is necessary; e.g. selection works just fine, so there is no need for manual touch support there)
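
For the second option, one possible way to recognize touch-generated mouse events is to remember when the last touch sequence ended and treat mouse events that arrive shortly afterwards as synthesized. This is only a sketch of that idea; the class is hypothetical and not part of sprotty:

```typescript
// Hypothetical helper for filtering the compatibility mouse events that browsers
// synthesize after a touch sequence (they fire shortly after touchend).
class TouchAwareFilter {
    private lastTouchEnd = 0;

    // Call this from a touchend handler.
    markTouchEnd(): void {
        this.lastTouchEnd = Date.now();
    }

    // True if a mouse event arriving now most likely originates from a touch.
    // The 500 ms window is a heuristic, not a spec-defined value.
    isSynthesizedFromTouch(): boolean {
        return Date.now() - this.lastTouchEnd < 500;
    }
}
```

Calling preventDefault() in the touchstart/touchend handlers would also suppress the compatibility mouse events entirely, but that would break listeners (like selection) that currently rely on them.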

One more thing: having both in the same listener would probably make some features easier to implement, as logic can be shared more easily. For example, if move should be supported via touch, one would either need to duplicate lots of code or move the core logic to a "helper class". However, that again would be a breaking change, as all these helper methods in MoveMouseListener are protected rather than private.

@spoenemann
Contributor

I see. I can imagine that it would still make sense to have a separate TouchTool/TouchListener mechanism and to handle the interference with mouse events specifically in those listeners where we want to support both input methods. By extracting interfaces IMouseListener and ITouchListener, it would be possible to have a listener class that supports both and is registered twice in the dependency injection container.
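
A minimal sketch of that registration pattern with InversifyJS (which sprotty uses for dependency injection). The IMouseListener/ITouchListener interfaces and the service identifiers are placeholders for the proposed extraction, not existing sprotty API:

```typescript
import 'reflect-metadata';
import { ContainerModule, injectable } from 'inversify';

// Hypothetical extracted interfaces (only a subset of methods shown).
interface IMouseListener {
    mouseDown(event: MouseEvent): void;
}
interface ITouchListener {
    touchStart(event: TouchEvent): void;
}

// Hypothetical service identifiers; sprotty's actual binding keys may differ.
const MOUSE_LISTENER = Symbol('IMouseListener');
const TOUCH_LISTENER = Symbol('ITouchListener');

@injectable()
class MoveListener implements IMouseListener, ITouchListener {
    mouseDown(event: MouseEvent): void {
        this.startMove(event.clientX, event.clientY);
    }
    touchStart(event: TouchEvent): void {
        this.startMove(event.touches[0].clientX, event.touches[0].clientY);
    }
    // Shared core logic used by both input paths.
    protected startMove(x: number, y: number): void {
        // ...
    }
}

// Register the same singleton under both listener identifiers.
export const moveModule = new ContainerModule(bind => {
    bind(MoveListener).toSelf().inSingletonScope();
    bind(MOUSE_LISTENER).toService(MoveListener);
    bind(TOUCH_LISTENER).toService(MoveListener);
});
```

The MouseTool would then pull in everything bound to the mouse-listener identifier and the new TouchTool everything bound to the touch-listener identifier, so the class participates in both without duplicating its logic.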

@nk-coding
Contributor Author

(Just FYI, I have not forgotten about this; however, I will be busy for the next two weeks, so the PR will take some time. Sorry!)
