
This is really cool; the hardware for this is far ahead of what I was using. I was using the classic webcam-type setup, with the 1/4 inch accuracy you alluded to in your post.

My thesis focused more on the UI components than the hardware. I set my gaze tracking up to appear as a cursor input, and used some CSS to hide the cursor on a webpage. Then I used hover effects to open menus, focus inputs, etc.
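As a rough sketch of that setup (all names here are illustrative, not from any real tracker SDK): the tracker's normalized gaze estimate gets mapped to page pixels, and the element under that point is marked so the page's existing hover-style rules can react. Note that synthetic mouse events from JS don't trigger real CSS `:hover`, so this sketch toggles a stand-in class instead; a system that injects cursor movement at the OS level, as described above, would get real `:hover` for free.

```javascript
// Map a normalized gaze sample (0..1 on each axis) to page coordinates.
function gazeToPage(gx, gy, viewportWidth, viewportHeight) {
  const clamp = (v) => Math.min(1, Math.max(0, v));
  return {
    x: Math.round(clamp(gx) * viewportWidth),
    y: Math.round(clamp(gy) * viewportHeight),
  };
}

// In the browser, mark whatever the gaze lands on with a "gaze-hover"
// class (a stand-in for :hover, which synthetic events can't trigger).
// The page's CSS would pair this with `cursor: none` and rules like
// `.gaze-hover > .menu { display: block }`.
let lastTarget = null;
function updateGazeHover(gx, gy) {
  if (typeof document === "undefined") return; // not running in a browser
  const { x, y } = gazeToPage(gx, gy, window.innerWidth, window.innerHeight);
  const target = document.elementFromPoint(x, y);
  if (target !== lastTarget) {
    lastTarget?.classList.remove("gaze-hover");
    target?.classList.add("gaze-hover");
    lastTarget = target;
  }
}
```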

My question was: could it replace the mouse on a desktop? I wanted to build something anyone could use, not just an accessibility input. I used eye gaze for pointing, with the spacebar on the keyboard as the primary action.

The components had large targets, so the accuracy didn't matter too much. I used some basic components to build an email UI, which worked purely through gaze and the keyboard.
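The large-target idea can be sketched like this (a hypothetical illustration, not my original thesis code): each control is hit-tested against bounds inflated by a padding that absorbs the tracker's error (roughly 0.25 in ≈ 24 px on a 96 dpi screen, an assumed figure). Gaze only points; pressing the spacebar activates whatever currently holds focus.

```javascript
// Assumed error margin in pixels, not a measured value.
const GAZE_PAD_PX = 24;

// Return the first target whose padded bounds contain the gaze point,
// or null if the gaze isn't near any target.
function focusedTarget(targets, gx, gy, pad = GAZE_PAD_PX) {
  return (
    targets.find(
      (t) =>
        gx >= t.x - pad &&
        gx <= t.x + t.w + pad &&
        gy >= t.y - pad &&
        gy <= t.y + t.h + pad
    ) ?? null
  );
}

// Wiring: gaze updates focus continuously; the spacebar commits.
function onSpacebar(targets, gaze) {
  const t = focusedTarget(targets, gaze.x, gaze.y);
  if (t && t.activate) t.activate();
}
```

With targets sized well above the padding, adjacent padded bounds barely overlap, which is why raw accuracy mattered less than it might seem.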

The UI was perfectly functional, but it really drained me and others to use it. It could have been due to accuracy, or to some of the UI component design. My gut feeling, though, was that the UI reacting to my eyes was the real problem: there was a strange feeling in knowing you needed to look at certain components to use them. The eye wants to jump around to whatever is interesting, and a UI that needs it to stay on a certain spot isn't pleasant.

I think any UI that wants to utilize the eye would have to be very subtle, and designed not to feel restrictive to your focus. I'm not convinced that eye tracking could replace the mouse or your fingers, but using it to render at higher resolution wherever you're looking in VR goggles, and that sort of thing, makes a lot of sense.



