Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?
There are fundamental limits on latency, especially with spread-spectrum transmission when all of this goes wireless. As accurate as the tracking and pointing are for controllers, I feel like some additional extrapolation is happening. It would be great to have an open source library for this so we can give hand-built rigs the best tracking that's mathematically possible.
>Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?
Absolutely. The Vive uses IMU-based dead reckoning fused with Lighthouse sensor data to provide tracking. The dead reckoning is super important for maintaining tracking during sensor occlusion. The API it interfaces with is SteamVR, which is mostly open source, so you can even see how they’re doing it. The new-generation Vive Pro will combine this with stereo-camera, CV-based inside-out tracking for even better precision.
This is pure speculation on my part, but it’s the only conceivable use for the cameras. They are laid out in exactly the same way as on the Samsung Odyssey headset, which does that. I can’t imagine they have solved the compositing issues involved in doing pass-through AR yet, although I’d be impressed if that’s the case.
Yes, it fuses both relative and absolute measurements (each with its own drawbacks) in what's usually called sensor fusion. It's very well explained here: http://doc-ok.org/?p=1478
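To make the idea concrete, here's a minimal sketch of that kind of fusion, not SteamVR's actual code: a high-rate relative source (IMU dead reckoning) is blended with a low-rate absolute source (Lighthouse-style fixes). The class name, method names, and the 0.98 blend factor are all my own illustrative assumptions.

```python
import numpy as np

class FusedTracker:
    """Toy complementary-filter fusion of dead reckoning and absolute fixes."""

    def __init__(self, blend=0.98):
        self.pos = np.zeros(3)   # estimated position (m)
        self.vel = np.zeros(3)   # estimated velocity (m/s)
        self.blend = blend       # how much we trust the integrated IMU estimate

    def predict(self, accel, dt):
        """Dead reckoning: integrate IMU acceleration (accumulates drift)."""
        self.vel += np.asarray(accel) * dt
        self.pos += self.vel * dt

    def correct(self, absolute_pos):
        """Absolute fix: pull the drifting estimate back toward the optical measurement."""
        self.pos = self.blend * self.pos + (1.0 - self.blend) * np.asarray(absolute_pos)
```

In practice you'd run predict() at IMU rate (hundreds of Hz) and correct() whenever an optical fix arrives, which is the structure the doc-ok article describes.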
I know the Vive does some kind of motion prediction for its controllers, at least when they lose tracking: if you quickly move a controller out of view of the lighthouses (kind of hard to do if your lighthouses are set up well; I had to hide the controller under my shirt), the system shows the controller continuing along its previous direction of motion for a short while.
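A hedged guess at what that behavior looks like in code: when the optical fix drops out, keep extrapolating the pose from the last known velocity, damping it so the controller coasts to a stop rather than flying off forever. The function name and damping constant are made up for illustration, not taken from SteamVR.

```python
import numpy as np

def extrapolate_pose(last_pos, last_vel, time_since_loss, damping=3.0):
    """Predict position after tracking loss by exponentially decaying the last velocity."""
    decay = np.exp(-damping * time_since_loss)        # remaining fraction of the velocity
    coasted = last_vel * (1.0 - decay) / damping      # integral of the decayed velocity
    return np.asarray(last_pos) + coasted
```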