SightLab supports face tracking on any headset that uses the OpenXR standard, including Meta headsets (Quest Pro, Quest 3) and the HTC Vive Focus Vision and Vive Focus 3 with the Face Tracker add-on.
Adding Face Tracking to Your Script
To log facial tracking data, import the relevant module:
```python
# For Meta/Quest
from sightlab_utils import face_tracker_data

# For HTC headsets
from sightlab_utils import face_tracker_data_htc
```
This module automatically saves face tracking data to your experiment’s /data folder.
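Once a trial has run, the saved log can be post-processed like any tabular data. The sketch below shows one way to summarize a logged expression column; the column names and CSV layout here are illustrative assumptions, not the actual SightLab schema, so adapt them to the files you find in your /data folder.

```python
import csv
import io

# Hypothetical sample of the kind of per-frame face tracking log SightLab
# writes to the /data folder (column names are illustrative, not the
# actual SightLab schema).
sample_log = """timestamp,JawDrop,LipCornerPullerL,LipCornerPullerR
0.011,0.02,0.10,0.12
0.022,0.35,0.11,0.13
0.033,0.41,0.09,0.11
"""

def mean_expression(csv_text, column):
    """Average one expression weight across all logged frames."""
    rows = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in rows]
    return sum(values) / len(values)

avg_jaw = mean_expression(sample_log, "JawDrop")
print(round(avg_jaw, 2))  # average JawDrop weight over the sample
```

In a real analysis you would open the trial's CSV file from the /data folder instead of the inline string.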
Built-In Face Tracking Templates & Examples
SightLab includes several example scripts for leveraging facial tracking:
Mirror Demo
Use this to map facial expressions to an animated avatar.
➤ Edit or remove the mirror if you only need the avatar mapping.
FaceTracker_Sliders
Demonstrates how facial expressions can drive GUI sliders in real time.
Facial_Expressions_Over_Time
Visualizes expression data with a matplotlib chart. Can load data from any face tracking-enabled trial.
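As a rough idea of the kind of chart this example produces, the sketch below plots a single expression weight against time using synthetic data. The waveform and column name (JawDrop) are fabricated for illustration; the actual Facial_Expressions_Over_Time script loads real logged trial data.

```python
# A minimal sketch of an expression-over-time chart using synthetic data
# (the SightLab example loads values from a face tracking-enabled trial).
import matplotlib
matplotlib.use("Agg")  # render off-screen so no display window is required
import matplotlib.pyplot as plt

# Fabricated expression weights sampled at 10 Hz (illustrative only)
timestamps = [t / 10 for t in range(50)]
jaw_drop = [abs(((t * 7) % 10) - 5) / 5 for t in range(50)]  # toy waveform

fig, ax = plt.subplots()
ax.plot(timestamps, jaw_drop, label="JawDrop")
ax.set_xlabel("Time (s)")
ax.set_ylabel("Expression weight (0-1)")
ax.set_title("Facial expression over time")
ax.legend()
fig.savefig("expression_over_time.png")
```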
Example Script
See Face_Tracking_Saving or Face_Tracking_Saving_HTC in the Face_Tracking_Data folder inside ExampleScripts.
```python
from sightlab_utils import face_tracker_data
import sightlab_utils.sightlab as sl
from sightlab_utils.settings import *

sightlab = sl.SightLab()

# Set up with custom values
face_tracker_data.setup()

def sightLabExperiment():
    while True:
        yield viztask.waitKeyDown(" ")
        # Start the update process
        vizact.ontimer(0, face_tracker_data.UpdateAvatarFace)
        yield sightlab.startTrial()
        yield viztask.waitKeyDown(" ")
        yield sightlab.endTrial()

viztask.schedule(sightlab.experiment)
viztask.schedule(sightLabExperiment)
```
Face Tracking Expression Parameters
Here are the parameters available for Face Tracking Expression: