
How to expose Eye Tracking? #25

Closed
blairmacintyre opened this issue Sep 11, 2018 · 8 comments

Comments

@blairmacintyre

We're finally seeing devices with integrated eye tracking (e.g., the Magic Leap 1) instead of having eye tracking only be a third party add-on to VR displays.

How should we expose these? Is this something that should be exposed automatically (e.g., akin to head pose and controllers) or require an app request (with opportunities for user permissions)? Is automatic exposure a privacy/security risk (feels like it could be, if Bad Actor[tm] can tell where you are looking).

@cwilso
Member

cwilso commented Sep 11, 2018

Hmm, at the very least seems like there might be the desire to have a separate permission.
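For concreteness, a separate permission could follow the shape of WebXR's existing feature-request pattern, where sensitive capabilities are requested per session and the user agent can prompt the user. This is only a sketch; the `'eye-tracking'` feature descriptor below is an assumption for illustration, not something any spec defines:

```javascript
// Hypothetical sketch: gating eye tracking behind a separate WebXR session
// feature, in the style of existing requiredFeatures/optionalFeatures requests.
// The 'eye-tracking' descriptor is an assumption, not part of any spec.

// Build session init options; eye tracking is requested as *optional* so the
// session can still start if the user (or the user agent) denies it.
function buildSessionInit(wantEyeTracking) {
  const init = { requiredFeatures: ['local'] };
  if (wantEyeTracking) {
    init.optionalFeatures = ['eye-tracking'];
  }
  return init;
}

// In a browser, the request might look like this (guarded, since navigator.xr
// only exists in WebXR-capable user agents):
async function startSession() {
  if (typeof navigator !== 'undefined' && navigator.xr) {
    return navigator.xr.requestSession('immersive-ar', buildSessionInit(true));
  }
  return null; // not running in a WebXR-capable environment
}
```

Requesting the capability as optional rather than required keeps the app usable for users who decline, which matters if eye data is treated as a distinct, deniable permission.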

@peterclemenko

From a security perspective, automatic exposure would be a privacy risk and would allow additional fingerprinting/deanonymization of a target user. Any and all biometric data gathered, including eye tracking, should be treated as a security/privacy risk. I would suggest making it another permission. From an analytics perspective, eye tracking can help profile a user's psyche (marketing people love eye tracking for a reason), and fine-grained enough data could help deanonymize a user in other ways as well.

On the bright side though, this might also be usable to expose things like Tobii eye tracking devices as well.

@blairmacintyre
Author

I tend to agree, but didn't want to presuppose this too much in my question. :)

In some ways, head and controller motion poses similar profiling threats. I'm pretty sure I've seen talks where people showed they can detect tremors before the user notices them (for example, as an early sign of Parkinson's), and that gait detection can identify people from similar motion data (not just video).

This is one of the reasons why I keep pushing for explicit, informed user-consent (not just "user action") before giving any XR data to an app.

@rcabanier

We'd like to explore this a bit further.
Since gaze is already exposed, would it be that much more of a privacy concern to add support for eye fixation?

@blairmacintyre
Author

How is gaze already exposed? Do you mean head-orientation-style "gaze" (which isn't really a person's gaze)?

Regardless, I would like to see this explored.

@rcabanier

Yes, it's not the actual gaze, but it still reveals where the person is looking. Maybe even with higher certainty than eye position, since it's more stable.

@blairmacintyre
Author

We should move the discussion over to a new repo, I think, but I'll point out that I don't agree with your comment, at all.

Head direction reveals, at a gross level, the general direction the user is facing; it says almost nothing about what they are looking at. Current AR HMDs (HoloLens, ML1, ODG, etc.) have such small FOVs that there is some correlation between head pose and user gaze, because it's almost impossible to see anything without turning your head to face it.

However, no display with such restrictive FOVs really represents a useful consumer device; they are fine for experimenting and doing development, and will certainly be useful for enterprise users and enthusiasts. But we shouldn't create APIs based on their characteristics.

When not wearing an HMD (i.e., when just looking around in the world, watching TV, or reading a book), or perhaps in VR if you're using one of the wider-FOV displays, our head gaze really doesn't say much about where our eyes are looking; we don't move our heads to keep what we're looking at aligned with our head's view direction. Certainly, we may never notice or look directly at things that are in our field of view.

Beyond this, the privacy risks of eye tracking are not just based on "what the user is looking at", but also on the biometric fingerprint that can be generated based on the user's eye movements, inferences that can be made, etc.

I look forward to using AR apps that can leverage eye gaze, but I don't want all applications to have access.

I also wonder if we might provide reduced/filtered information to applications (e.g., only providing gaze when there is some non-trivial dwell time), and so on. I'm not sure (at all) what the tradeoffs are here, but one question will be "what will apps really want?"
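The dwell-time idea could be sketched as a small filter sitting between the eye tracker and the app: raw per-frame samples stay private, and a point is exposed only after the gaze has lingered near it. Everything here (class name, thresholds, normalized 2D coordinates) is an illustrative assumption, not a proposed API:

```javascript
// Hypothetical dwell-time filter: the app only receives a gaze point after
// the eye has rested near it for a minimum duration, instead of getting a
// raw eye-tracking stream. All names and thresholds are assumptions.
class DwellFilter {
  constructor(minDwellMs = 300, radius = 0.05) {
    this.minDwellMs = minDwellMs; // how long the gaze must linger (ms)
    this.radius = radius;         // how far it may drift and still count as dwelling
    this.anchor = null;           // candidate fixation point
    this.anchorTime = 0;          // timestamp when the candidate was first seen
  }

  // Feed one raw gaze sample (normalized coordinates, timestamp in ms).
  // Returns the fixation point once the dwell threshold is met; otherwise
  // returns null, i.e., nothing is exposed to the app.
  update(x, y, timeMs) {
    if (this.anchor &&
        Math.hypot(x - this.anchor.x, y - this.anchor.y) <= this.radius) {
      if (timeMs - this.anchorTime >= this.minDwellMs) {
        return { ...this.anchor };
      }
    } else {
      // Gaze moved away: restart the dwell timer at the new point.
      this.anchor = { x, y };
      this.anchorTime = timeMs;
    }
    return null;
  }
}
```

A filter like this trades latency for privacy: the app learns about deliberate fixations (useful for gaze-based UI) but never sees the fast saccades that carry most of the biometric signal.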

@TrevorFSmith
Contributor

It seems like forward motion on this Issue has stalled so I'm going to close it. If you have a plan for how to make progress then ping me and we can re-open it.


5 participants