
This is really cool.

Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?

There are fundamental limits on latency, especially with spread-spectrum transmission when all of this goes wireless. As accurate as the tracking and pointing are for controllers, I feel like some additional extrapolation is happening. It would be great to have an open source library for this so we can give hand-built rigs the best tracking that's mathematically possible.
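
To make concrete what I mean by extrapolation: predicting the pose forward to the expected display time from the last tracked state. A toy sketch of a constant-velocity, constant-angular-velocity predictor (the model and all names here are my own assumptions, not anything from Oculus or Valve):

    import numpy as np

    def predict_pose(pos, vel, quat, ang_vel, dt):
        # Extrapolate a tracked pose dt seconds ahead under a
        # constant-velocity, constant-angular-velocity model.
        # pos, vel: np.array 3-vectors (m, m/s)
        # quat: unit quaternion as np.array [w, x, y, z]
        # ang_vel: np.array 3-vector (rad/s)
        pred_pos = pos + vel * dt

        # Integrate the quaternion derivative q_dot = 0.5 * q * [0, omega],
        # then renormalize to stay on the unit sphere.
        w, x, y, z = quat
        ox, oy, oz = ang_vel
        q_dot = 0.5 * np.array([
            -x * ox - y * oy - z * oz,
             w * ox + y * oz - z * oy,
             w * oy - x * oz + z * ox,
             w * oz + x * oy - y * ox,
        ])
        pred_quat = quat + q_dot * dt
        pred_quat /= np.linalg.norm(pred_quat)
        return pred_pos, pred_quat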



>Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?

Absolutely. The Vive fuses IMU-based dead reckoning with absolute fixes from the Lighthouse sensors to provide tracking. The dead reckoning is super important for maintaining tracking during sensor occlusion. The API it interfaces with is SteamVR, which is mostly open source, so you can even see how they're doing it. The new-generation Vive Pro will combine this with stereo-camera, CV-based inside-out tracking for even better precision.
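
To sketch the idea (purely illustrative Python, not how SteamVR actually structures it):

    import numpy as np

    class ToyTracker:
        # Illustrative only; not SteamVR's actual implementation.
        def __init__(self):
            self.pos = np.zeros(3)   # meters, world frame
            self.vel = np.zeros(3)   # m/s

        def imu_update(self, accel_world, dt):
            # Between optical fixes, double-integrate acceleration.
            # Error grows roughly quadratically with time, which is why
            # occlusion can only be bridged for a short period before
            # drift becomes visible.
            self.vel += accel_world * dt
            self.pos += self.vel * dt

        def lighthouse_update(self, measured_pos, gain=0.5):
            # An absolute optical fix pulls the estimate back toward
            # ground truth; blending instead of snapping avoids a
            # visible jump in the rendered pose.
            self.pos += gain * (measured_pos - self.pos)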


Do you have a source on the stereo cameras of the Vive Pro being used for inside out tracking?


This is pure speculation on my part, but it's the only conceivable use for the cameras. They're laid out in exactly the same way as on the Samsung Odyssey headset, which does that. I can't imagine they've solved the compositing issues involved in doing pass-through AR yet, although I'd be impressed if that's the case.


> They are laid out in the exact same way as the Samsung Odyssey headset

Look again. The Odyssey's cameras sit below eye level and point down & sideways, which makes sense for tracking:

https://www.samsung.com/us/computing/hmd/windows-mixed-reali...

The Vive Pro cameras sit at eye level, at average interpupillary distance, and point straight ahead:

http://s3.amazonaws.com/digitaltrends-uploads-prod/2018/01/h...

That makes the most sense for pass-through AR.


Yes, it combines both relative and absolute measurements (each with its own drawbacks) in what's usually called sensor fusion. It's very well explained here: http://doc-ok.org/?p=1478
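
The core idea from that article boils down to something like a complementary filter (illustrative Python; alpha is a made-up constant, not a real tuning value):

    def fuse(estimate, imu_delta, optical_pos, alpha=0.98):
        # Relative sensor (IMU): low noise short-term, but drifts.
        # Absolute sensor (Lighthouse): drift-free, but sparser/noisier.
        # Trust the IMU on short timescales and let the optical fix
        # bleed in slowly to cancel the accumulated drift.
        predicted = estimate + imu_delta     # integrated IMU step
        return alpha * predicted + (1.0 - alpha) * optical_pos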

The headset is not wireless, and the controllers have 1-2 ms of latency. The controller data is heavily compressed in a clever way. More info: https://hackaday.com/2016/12/12/cnlohr-reverses-vive-valve-e...


I know the Vive does some kind of motion prediction for its controllers, at least when they lose tracking: if you quickly move a controller out of view of the lighthouses (hard to do if the lighthouses are set up well; I had to hide the controller under my shirt), the system shows the controller continuing along its last direction of motion for a short time.
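
That behavior is consistent with coasting on the last known velocity and damping it toward zero. A toy version (the decay constant tau is my guess, not a Valve value):

    import math

    def coast_position(last_pos, last_vel, t_since_loss, tau=0.2):
        # After tracking loss, keep moving along the last known velocity,
        # exponentially damped so the controller glides to a stop instead
        # of flying off forever. This is the closed-form integral of
        # v(t) = last_vel * exp(-t / tau).
        decay = math.exp(-t_since_loss / tau)
        return last_pos + last_vel * tau * (1.0 - decay)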



