

Common Metrics Tracked

Below is a list of some of the common metrics that SightLab can collect and visualize:

1. Core Eye Tracking Metrics

The SightLab VR Pro plugin for Vizard collects a wide range of detailed metrics, including gaze data, fixations, saccades, and physiological correlates. Below is a structured breakdown of these metrics.

Note that the collected data can easily be extended with minimal effort (for instance, adding heart rate, hand position, etc.), as sketched below.
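
As a rough illustration of how an extra measure could sit alongside the standard per-frame columns, the sketch below appends a heart-rate value to each row of a CSV log. The file name, column names, and get_heart_rate() helper are hypothetical placeholders; this shows the general pattern of adding a custom column rather than SightLab's own logging API.

```python
import csv
import random
import time

def get_heart_rate():
    """Hypothetical sensor read; replace with your device's actual API."""
    return 60 + random.random() * 40

# Standard per-frame columns plus one custom column appended at the end.
fieldnames = ["timestamp", "trial", "gaze_x", "gaze_y", "gaze_z", "heart_rate"]

with open("tracking_data_custom.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    for frame in range(5):  # stand-in for the per-frame update loop
        writer.writerow({
            "timestamp": time.time(),
            "trial": 1,
            "gaze_x": 0.0, "gaze_y": 1.6, "gaze_z": 2.0,  # placeholder gaze intersection
            "heart_rate": get_heart_rate(),               # the added custom metric
        })
```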

A. Fundamental Eye Metrics

  1. Timestamp – Records the exact time of each data entry.
  2. Trial Label – Identifies the trial during which the data was collected.
  3. Gaze Intersection (Combined x,y,z) – The 3D point where the combined gaze ray intersects the scene.
  4. Left & Right Eye Gaze Intersection – Individual gaze intersection points for each eye.
  5. Eye Euler Angles (Combined) – Orientation of the gaze in Euler angles (Yaw, Pitch, Roll).
  6. Eye Euler Angles (Left & Right) – Separate Euler angles for each eye.
  7. Head Position (6DOF) – Position and orientation of the head in six degrees of freedom.
  8. Fixation / Saccade State – Whether the gaze is fixated or in a saccade movement.
  9. Saccade Amplitude – The distance covered by the eye during a saccade.
  10. Saccade Velocity – The speed at which the eye moves during a saccade.
  11. Average Saccade Amplitude – Mean amplitude of saccades over a trial.
  12. Peak Saccade Velocity – Maximum speed recorded during a saccade (amplitude and velocity are illustrated in the sketch after this list).
  13. Pupil Diameter – Measurement of pupil size (device dependent).
  14. Eye Openness – Determines if the eye is fully open, partially closed, or closed (device dependent).
  15. Dwell Time – The duration gaze remains on an object or region.
  16. View Count – The number of times an object/region has been viewed.
  17. Fixation Count – The number of fixations on an object/region.
  18. Average View Time – The mean duration of views per object/region.
  19. Time to First Fixation – Time taken to fixate on an object/region after stimulus onset.
  20. Condition / Performance Data – Custom experimental data related to trial conditions.
  21. Custom Flags – User-defined markers for events or states during the experiment.
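
As a rough sketch of items 9–12 above, saccade amplitude can be approximated as the angular distance between the first and last samples of a saccade, and saccade velocity as the angular change between consecutive samples divided by the sample interval. The (timestamp, yaw, pitch) samples and the small-angle distance approximation below are assumptions made for illustration, not SightLab's internal computation.

```python
import math

# (timestamp_seconds, yaw_degrees, pitch_degrees) samples spanning one saccade.
samples = [
    (0.000, 0.0, 0.0),
    (0.011, 2.5, 0.4),
    (0.022, 6.0, 1.0),
    (0.033, 8.5, 1.2),
]

def angular_distance(a, b):
    """Approximate angular distance (degrees) between two (yaw, pitch) directions."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

# Saccade amplitude: angular distance from the first to the last sample.
amplitude = angular_distance(samples[0][1:], samples[-1][1:])

# Sample-to-sample velocities (degrees per second); the peak is the maximum.
velocities = [
    angular_distance(p0[1:], p1[1:]) / (p1[0] - p0[0])
    for p0, p1 in zip(samples, samples[1:])
]
peak_velocity = max(velocities)

print(f"amplitude: {amplitude:.1f} deg, peak velocity: {peak_velocity:.0f} deg/s")
```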

B. Object-Specific Gaze Data

  • Gaze event count per object – Number of gaze interactions per object (view count).
  • Dwell time per object – How long gaze is maintained on an object.
  • Fixations per dwell – Number of fixations occurring while dwelling on an object.
  • Total gaze duration per object – Sum of all gaze durations for an object (total view time).
  • Average gaze duration per object – Mean duration of gaze events per object (average view time).
  • Raw gaze data access – Includes gaze vector transformations, Euler angles, and positions of gaze points.
  • Fixations & Saccades detection – Based on dispersion method (angular distance and time threshold).

C. Advanced Eye Tracking Analysis

  • Fixation Sequence Analysis – The order in which different areas of interest (AOIs) are fixated (see the transition-matrix sketch after this list).
  • AOI (Area of Interest) Analysis – Defining regions of interest and analyzing gaze behavior.
  • Gaze Contingent Display – Changing the display based on gaze position for dynamic experiments.
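
A common way to summarize fixation sequence and AOI analysis together is a transition matrix counting how often gaze moves from one AOI to the next. The sketch below assumes a plain ordered list of AOI labels (one per fixation); the labels are placeholders and this is not a SightLab API call.

```python
from collections import Counter

# Ordered AOI labels, one per detected fixation (placeholder sequence).
fixation_sequence = ["sign", "car", "pedestrian", "car", "sign", "car"]

# Count transitions between consecutive fixations on different AOIs.
transitions = Counter(
    (src, dst)
    for src, dst in zip(fixation_sequence, fixation_sequence[1:])
    if src != dst
)

for (src, dst), count in transitions.most_common():
    print(f"{src} -> {dst}: {count}")
```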

2. Visualization and Playback

SightLab VR Pro supports multiple visualization methods for gaze data, aiding analysis and interpretation during interactive replay:

A. Visualizations

  1. Heatmaps – Represent the intensity of gaze on different areas.
  2. Scan Paths – Show gaze movement sequences and fixations.
  3. Walk Paths – Display movement along with gaze direction and velocity.
  4. Interactive Playback – Allows replay of gaze behavior in VR.
  5. Fixation & Dwell Spheres – Highlight fixations and dwell points in 3D space.
  6. Gaze Point Positioning – Shows precise gaze positions in the virtual environment.
  7. Multi-User Interactions – Enables analysis of multiple participants' gaze data in collaborative VR.
  8. Bar charts of view data (total view counts, average view count, etc.) – a plotting sketch follows this list.
  9. Comparison graphs of independent vs. dependent variables.
  10. Additional graphs and charts via matplotlib, seaborn, and plotly (scatterplots, histograms, box plots, etc.).
  11. Real-time data visualization via biofeedback or other data sources.
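
As one example of item 8, a per-object summary can be plotted as a simple bar chart with matplotlib. The object names and view times below are placeholder values standing in for whatever a session's summary reports.

```python
import matplotlib.pyplot as plt

# Placeholder per-object summary values (e.g., taken from a session's summary output).
objects = ["car", "sign", "pedestrian", "building"]
total_view_time_s = [12.4, 5.1, 8.7, 2.3]

plt.figure(figsize=(6, 4))
plt.bar(objects, total_view_time_s)
plt.ylabel("Total view time (s)")
plt.title("Total view time per object")
plt.tight_layout()
plt.show()
```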

3. Physiological Correlates

For integrated biometric research, SightLab supports physiological data collection when used with BIOPAC; a timestamp-alignment sketch follows the examples below:

Examples (not a full list)

  • Electrodermal Activity (EDA) – Measures skin conductance response.
  • Heart Rate (HR) – Tracks cardiovascular responses.
  • Electroencephalography (EEG) – Captures brain activity.
  • Functional Near-Infrared Spectroscopy (fNIRS) – Measures cortical hemodynamic responses.
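
When a physiological channel is recorded alongside eye tracking, a typical analysis step is aligning each gaze sample with the nearest physiological reading by timestamp. The sketch below does this with pandas.merge_asof; the column names and values are placeholders and do not reflect SightLab's or BIOPAC's actual file formats.

```python
import pandas as pd

# Placeholder gaze samples (e.g., from the per-frame tracking file).
gaze = pd.DataFrame({
    "timestamp": [0.00, 0.02, 0.04, 0.06],
    "object":    ["car", "car", "sign", "sign"],
})

# Placeholder EDA samples recorded at a different rate.
eda = pd.DataFrame({
    "timestamp": [0.000, 0.025, 0.050],
    "eda_microsiemens": [1.10, 1.15, 1.22],
})

# Attach the most recent EDA reading to each gaze sample.
merged = pd.merge_asof(gaze.sort_values("timestamp"),
                       eda.sort_values("timestamp"),
                       on="timestamp", direction="backward")
print(merged)
```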

4. Additional Data

Hardware-Dependent Body Position

  • Full body position
  • Finger position and angle
  • Face Tracking (including facial expression output)

Additional Optional Tracking:

  • Audio Recording and transcription

Experiment Metrics and More

  • Experiment management features – tracking the current trial number, managing participant data, and handling experiment timelines
  • Proximity-Based Interactions
  • User Feedback through survey and rating tools
  • User inputs and demographics
  • Button clicks and interactions
  • Grab and release events
  • Scene Objects
  • Data Analysis Tools to measure modified (independent) vs. measured (dependent) variables (see the sketch after this list)
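
As a minimal sketch of comparing a modified (independent) variable against a measured (dependent) one, per-trial results can be grouped by condition and summarized with pandas. The condition labels and dwell-time values below are placeholders, not output from an actual SightLab session.

```python
import pandas as pd

# Placeholder per-trial results: condition is the independent variable,
# average dwell time is the dependent variable.
trials = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "condition":   ["low_light", "bright", "low_light", "bright", "low_light", "bright"],
    "avg_dwell_s": [1.8, 2.6, 1.5, 2.9, 2.1, 2.4],
})

# Summarize the dependent variable per level of the independent variable.
summary = trials.groupby("condition")["avg_dwell_s"].agg(["mean", "std", "count"])
print(summary)
```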