- cross-posted to:
- technology
- hackernews
- [email protected]
cross-posted from: https://lemmy.zip/post/22604748
The Vision Pro uses 3D avatars on calls and for streaming. These researchers used eye tracking to work out the passwords and PINs people typed with their avatars.
Archived version: https://web.archive.org/web/20240912100207/https://www.wired.com/story/apple-vision-pro-persona-eye-tracking-spy-typing/
Couldn’t you theoretically do the same thing by tracking someone’s eye movements on video chat, if they look at their keyboard while typing?
You’d have to move your eyes from letter to letter, like Vision Pro users do.
Maybe, but I’m guessing most cameras don’t capture your pupil at as high a resolution?
Yes and no. It’s not really as accurate: 1) if the person doesn’t look at their keyboard at all, or 2) if they only glance toward the approximate area of a letter and then type from memory of its position. But this could be countered by training an AI to extrapolate the results and get something more precise.
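To make the “extrapolate gaze to letters” idea concrete, here’s a minimal sketch (not the researchers’ actual method; the key coordinates and fixation data below are made up for illustration). It just snaps each gaze fixation to the nearest key centre on a virtual keyboard layout. A real attack would need calibration, fixation detection, and probably a language model to clean up noisy guesses.

```python
import math

# Hypothetical key centres on a normalized on-screen keyboard (x, y in [0, 1]).
KEY_CENTRES = {
    "q": (0.05, 0.2), "w": (0.15, 0.2), "e": (0.25, 0.2), "r": (0.35, 0.2),
    "a": (0.10, 0.5), "s": (0.20, 0.5), "d": (0.30, 0.5),
    "z": (0.15, 0.8), "x": (0.25, 0.8),
}

def nearest_key(gaze_point):
    """Return the key whose centre is closest to a single gaze fixation."""
    gx, gy = gaze_point
    return min(KEY_CENTRES, key=lambda k: math.dist((gx, gy), KEY_CENTRES[k]))

def guess_typed_text(fixations):
    """Map a sequence of gaze fixations to a guessed character sequence."""
    return "".join(nearest_key(p) for p in fixations)

# Fabricated fixation sequence that roughly hovers over 'w', 'a', 's'.
sample_fixations = [(0.16, 0.19), (0.11, 0.52), (0.21, 0.48)]
print(guess_typed_text(sample_fixations))  # -> "was"
```

Even this naive nearest-key snapping recovers characters when the gaze lands anywhere near the right key, which is why approximate glances still leak information.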