OpenXR
See the Vizard documentation for more information on using OpenXR.
For help connecting the Meta Quest Pro, see this article.
For help connecting Varjo headsets, see this tutorial.
For additional OpenXR devices, see the Vizard documentation linked above.
Examples can be found in ExampleScripts for:
- Mixed Reality
- Meta_Pro_Full_Body
- Hand_Tracking_Grabbing_Physics
- Face Tracking
- Mirror demo

(Select Meta Quest Pro, Quest 3, Varjo, OpenXR (EyeTR or non EyeTR), Vive Focus 3 OpenXR, or Omnicept OpenXR from the device list.)
Eye Tracking
To test eye tracking, select 'Meta Quest Pro', 'Varjo', or 'Vive Focus 3 OpenXR' from the hardware dropdown.
Hand Tracking
As of version 1.9.8, add from sightlab_utils import handTrackingOpenXR to the top of your script. You can also select "Meta Pro Hand."
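A minimal sketch of how this might look at the top of a script; only the handTrackingOpenXR import is documented here, and the rest of your SightLab experiment code is assumed to follow unchanged:

```python
# Minimal sketch (assumes SightLab 1.9.8 or later).
# Only the import below is documented; the surrounding lines are placeholders
# for your own experiment script.
from sightlab_utils import handTrackingOpenXR  # enables OpenXR hand tracking

# ... the rest of your existing SightLab experiment code goes here ...
```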
Mixed Reality
To test Meta Quest Pro or Varjo mixed reality, see examples in the Mixed Reality folder.
Upper Body Tracking
To test upper body tracking, refer to the Meta_Pro_Full_Body folder or import body_tracking from sightlab_utils into any script. You can also check out the virtual mirror demo. See this page for more information.
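As a minimal sketch, assuming only the documented import (everything else is a placeholder for your own script):

```python
# Minimal sketch: adding upper body tracking to an existing SightLab script.
# Only the body_tracking import is documented; everything else is a placeholder.
from sightlab_utils import body_tracking  # enables upper body tracking

# ... the rest of your existing SightLab experiment code goes here ...
```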
Face Tracking
To test face tracking, see the virtual mirror demo or the FaceTrackingData example in ExampleScripts, which shows sliders for all 60 data points on the face. You can also add saving of face tracking data to any script. See this page for instructions.
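As an illustration only (follow the linked instructions for the supported way to save face tracking data), the sketch below logs per-frame face values to a CSV file. The get_face_weights function is a hypothetical placeholder for whatever face tracking data source your script uses; viz.tick and vizact.onupdate are standard Vizard calls.

```python
# Illustrative sketch only: logging face tracking values to a CSV file each frame.
# get_face_weights() is a hypothetical placeholder -- replace it with the face
# tracking data source used in your script (see the linked instructions).
import csv
import viz
import vizact

NUM_FACE_POINTS = 60  # the documentation mentions 60 face data points

def get_face_weights():
    # Placeholder: return a list of 60 values from your face tracker.
    return [0.0] * NUM_FACE_POINTS

log_file = open('face_tracking_log.csv', 'w', newline='')
writer = csv.writer(log_file)
writer.writerow(['time'] + ['point_%d' % i for i in range(NUM_FACE_POINTS)])

def log_face_data():
    # Write one row per frame: simulation time plus the 60 face values.
    writer.writerow([viz.tick()] + get_face_weights())

vizact.onupdate(0, log_face_data)  # call log_face_data every frame
```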
Features and Examples
| Example | Description |
| --- | --- |
| Mirror Example | Facial Tracking, Upper Body Tracking, Hand Tracking |
| Mixed Reality | Mixed Reality object viewer, virtual screen, adding to any project |
| Upper Body Tracking | Adding Upper Body Tracking to any scene |
| Hand Grabbing and Physics | |
| Face Tracking | See various ways to save, view, and analyze face tracking data |
| Eye Tracking | Analyze gaze data and track eye movements |
For more details, refer to the Vizard section on OpenXR.