
How well the new interaction method (eye and hand tracking) actually works was the thing I was most interested in.

> I’ve used essentially every major VR headset and AR device since 2013’s Oculus DK1 right up through the latest generations of Quest and Vive headsets. I’ve tried all of the experiences and stabs at making fetch happen when it comes to XR.

Here’s what Apple got right that other headsets just couldn’t nail down:

The eye tracking and gesture control is near perfect.

https://techcrunch.com/2023/06/05/first-impressions-yes-appl...




But how precise is it? Eye tracking may be good enough for foveated rendering, hand tracking good enough for pushing buttons, but how does it compare to handheld controllers for drawing something in Photoshop?


Marques Brownlee picks out the quality of the eye tracking as a standout feature.

https://www.youtube.com/watch?v=OFvXuyITwBI


It depends on the task, though. Selection is easy to do with eye tracking, since you just have to shift your attention onto the object long enough to click.

Drawing is a task that is more like aiming: attempting to hold your gaze at a precise location for a period of time. It is possible, but you are fighting the natural saccade pattern of your eyes, and it becomes tiring very quickly. You can expect fatigue and eye strain within a few minutes.
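To make the distinction concrete, here is a rough sketch of why selection tolerates gaze jitter while drawing does not. The types and names are made up for illustration; this is not the real visionOS API (which does not hand raw gaze coordinates to apps at all), just the shape of the two interactions:

    import CoreGraphics
    import Foundation

    // Illustrative types only, not an actual visionOS interface.
    struct Target {
        let id: String
        let bounds: CGRect
    }

    final class GazePinchSelector {
        private(set) var hoveredTargetID: String?

        // Selection: the gaze point only has to land anywhere inside the
        // target's bounds, so small saccades are absorbed by the hit box.
        func updateGaze(at point: CGPoint, targets: [Target]) {
            hoveredTargetID = targets.first { $0.bounds.contains(point) }?.id
        }

        // A pinch gesture confirms whatever is currently hovered.
        func pinch() -> String? {
            hoveredTargetID
        }
    }

    // Drawing would instead have to treat the raw gaze point as a pen tip
    // on every frame. There is no hit box to absorb the jitter, so every
    // involuntary saccade ends up in the stroke.

That is the sense in which "good enough for pushing buttons" and "good enough for drawing in Photoshop" are very different bars.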


> Drawing is a task that is more like aiming

A new platform certainly doesn't have to be better at every task.

Even 20 years ago, a professional digital illustrator wouldn't try to draw everything on screen with a mouse, they would have a tool more suited to the task, like a Wacom tablet.


Agreed, I wouldn't want to do something like drawing with the eye tracking. I would expect hand tracking to be used for that.



