The External Application Data Recorder lets you record, save, and synchronize eye and sensor data while running external VR applications, including SteamVR games, Unity or Unreal applications, Oculus apps, web-based VR experiences, and standalone/Android-based applications (note: standalone does not include eye tracking). After a session you can run the replay and view the gaze point synchronized with the data. There is also an experimental mode that uses AI to detect and count views of tagged objects or regions.
At‑a‑Glance
Platforms: Unreal, Unity, SteamVR/OpenXR PC apps; desktop apps; web; standalone/Android via casting (head pose only).
Headsets: Vive Focus Vision, Vive Focus 3, Vive Pro Eye (use the Vive Focus 3 Recorder), Meta Quest Pro / 3 / 3S / 2 (no eye tracking on 2/3/3S), Varjo XR‑3/XR‑4, HP Omnicept, Quest 3 with Pupil Labs, plus generic SteamVR/OpenXR (results may vary).
Screen‑based Eyetrackers: Tobii, EyeLogic.
Outputs: CSV logs (gaze, fixations/saccades, events, custom markers, face tracking, and more), video with gaze overlay, and Biopac AcqKnowledge markers. Replay with scan paths, fixation spheres, and heatmaps. See here for the full list.
SRanipal/SteamVR for Vive Focus Vision / Focus 3 / Pro Eye: install Vive Console for SteamVR.
To install FFmpeg, open a CMD window and run: winget install ffmpeg
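To verify the install, you can run ffmpeg -version in the same CMD window; if the command is found, FFmpeg is on your PATH.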
How to Run
Set up hardware and launch the external application: Make sure your hardware/headset is connected and running, then launch the external application you want to record data with (e.g., Unreal, Unity, SteamVR, etc.)
Launch the External_Application_Data_Recorder.py file to get the recorder ready
Choose the Window Title to capture.
Select recording length (seconds). You can end early by closing the window or pressing Space.
Choose your hardware
Put on the headset and press Spacebar (default; configurable). If connected to Biopac, AcqKnowledge transport starts. On supported HMDs you’ll see a live gaze point on the mirrored window.
Finish & save: When the timer ends you’ll hear a beep. Data goes to the /data folder.
View a replay with visualizations by launching the External_Application_Data_Replay.py file and choosing your session. Note: If the replay crashes, you may need to run convert_video_to_fix.py to convert the video.
Use the scrubber in the replay to scrub through the playback. Press the 1 and 2 keys to synchronize the scrubber with AcqKnowledge if using version 6.02 or higher. See this page for more information and controls for the Replay.
Toggle visualizations such as the scan path, fixation spheres, and the heatmap in the replay. Note: You may need to toggle an item off and on before its visualization appears. Visualizations other than the gaze point are not available for standalone/Android-based applications.
Press the "4" key in the replay to record a video with the gaze point (will need to let this play through in real time). Press "5" to stop. This will save in the "replay_recordings" folder. Open the recorded video in AcqKnowledge to see the synchronization with physiological data. See here for how to do that.
You can run the experimental AOI Tracker Tool to tag objects or regions to follow (see below).
Biopac AcqKnowledge markers & sync; AcqKnowledge playback can be driven by the SightLab Replay slider (6.02+)
Note: Standalone/Android via casting records head pose only; no eye vector or object intersections.
Running Standalone/Android via Casting (Meta)
On the headset: Settings → Camera → Cast.
On PC: open https://www.oculus.com/casting/, press F11 for fullscreen.
In the Recorder, choose Stand-Alone as the hardware mode ("Desktop" if running an older version without that option) and select the browser cast window.
Replay will show a virtual screen; data syncs with AcqKnowledge if connected.
Configuration Settings
# ----- Biopac & Network Sync -----
BIOPAC_ON = True                 # Communicate with Biopac AcqKnowledge
LOCK_TRANSPORT = True            # Lock the transport
NETWORK_SYNC_KEY = 't'           # Key to send event marker to AcqKnowledge
NETWORK_SYNC_EVENT = 'triggerPress'  # Event trigger for marker
USE_NETWORK_EVENT = False        # Send network event to external app
NETWORK_START = False            # If True, use a network event to start (instead of START_END_SESSION_KEY)
# Network host/port and JSON event names (must match Unity/Unreal sender)
NETWORK_HOST = 'localhost'
NETWORK_PORT = 4950
NETWORK_START_EVENT_NAME = 'start_trial'

# ----- Data Recording -----
RECORD_VIDEO = True              # Enable video recording
RECORD_VIDEO_OF_PLAYBACK = True  # Record during replay
RECORD_GAZE_MAT = True           # Save gaze matrix data
RECORD_FACE_TRACKER_DATA = False # Save facial expression data
OLDER_SIGHTLAB_PID = False       # Use older participant ID format

# ----- Video Recording Method -----
# "OPENCV" = Compressed videos (~50MB/min), may crash after ~10min
# "VIZARD_BUILT_IN" = Uncompressed (~400MB/min), stable for hours, records Vizard window only
# "SIGHTLAB_BUILT_IN" = Vizard built-in with compression - can take some time to compress
# "IMAGEIO_FFMPEG" = External window capture via imageio-ffmpeg, may crash after ~10min
# "FFMPEG" = Direct FFmpeg subprocess, most stable for very long recordings. Requires FFmpeg
SCREEN_RECORDER_TYPE = "IMAGEIO_FFMPEG"

# ----- Video Quality Settings -----
VIDEO_RECORDING_WINDOW_HEIGHT_NEW = '1920'
VIDEO_RECORDING_WINDOW_WIDTH_NEW = '1080'

# ----- Timer & Session Control -----
USE_TIMER = True                 # Use timer instead of keypress to end trial
USE_TIMER_DROPDOWN = True        # Show dropdown to select timer length
DEFAULT_TIMER_LENGTH = 10        # Default timer length (seconds)
START_END_SESSION_KEY = ' '      # Spacebar to start/stop trial
PLAY_END_SOUND = True            # Play sound at end of trial
TRIAL_CONDITION = 'A'            # Default trial condition label
SET_NUMBER_OF_TRIALS = 1

# ============================================================================
# REAL-TIME STREAMING SETTINGS (During Recording)
# ============================================================================
REAL_TIME_STREAMING = True       # Show live window capture in VR during recording

# ----- Texture Alignment (if video appears offset) -----
# Use keyboard controls to adjust, then copy values here:
# Arrow Keys: Move texture | Numpad 4/6/8/2: Scale texture
# P: Print values | R: Reset to defaults
TEXTURE_OFFSET_X = 0.0           # Horizontal offset
TEXTURE_OFFSET_Y = 0.0           # Vertical offset
TEXTURE_SCALE_X = 1.0            # Horizontal scale
TEXTURE_SCALE_Y = 1.0            # Vertical scale
TEXTURE_ADJUSTMENT_STEP = 0.05   # Adjustment increment per keypress

# ============================================================================
# REPLAY SETTINGS (During Playback)
# ============================================================================
HIDE_REPLAY_GUI = False          # Hide SightLab's replay GUI
FOLLOW_ON = True                 # Enable first-person view in replay
REPLAY_SECOND_GAZE_POINT_OFF = True   # Default True
override_screen_position = True  # Flag to control whether we override replay position
override_real_time_position = False   # Default False

# ----- Screen Position & Size -----
# Adjust these if the replay video appears misaligned or wrong size
# Use keyboard controls during replay to find perfect values:
# Arrow Keys: Move | PgUp/PgDn: Depth | +/-: Width | [/]: Height
# K: Print values | L: Reset | O: Toggle override

# ----- Viewpoint Adjustment -----
# Fine-tune camera position/rotation for optimal replay viewing

# ============================================================================
# PROFILE MAPPING AND PRESET CONFIGURATIONS
# ============================================================================

See the config file for the full profile mapping list you can adjust.
When to switch recorder type
• VIZARD_BUILT_IN: rock‑solid for long runs (uncompressed, big files; records the Vizard window only).
• IMAGEIO_FFMPEG / OPENCV: smaller files, but may become unstable after ~10 min.
• SIGHTLAB_BUILT_IN: Vizard built‑in with compression (compressing can take some time).
• FFMPEG: direct FFmpeg subprocess; most stable for very long recordings (requires FFmpeg).
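For example, to favor stability on a multi‑hour session, set the recorder type in the config (the trade‑offs here are taken from the config comments above):

SCREEN_RECORDER_TYPE = "FFMPEG"  # most stable for very long recordings; requires FFmpeg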
Scrub: use hotkeys B/N or C/V to step; drag the slider to set the time (video frame jumps follow the hotkeys)
Record gaze video: 4 start / 5 stop → saves to /replay_recordings
Auto AI View Detection (Experimental)
Record & Replay
Use the recorder as normal → in replay press 4/5 to export a video with gaze (saved under /replay_recordings).
Track ROIs (optional)
Run AOI_Tracker_Tool:
Space/P pause/resume · s select ROI (while paused) · Enter/Space confirm ROI · c/v step back/forward · q export
Outputs: tracked_multiple.avi, tracked_boxes.csv, and a preview video saved to roi_videos_data.
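As a sketch of how you might post‑process these outputs, the snippet below loads tracked_boxes.csv with pandas and tests whether a gaze sample falls inside a tracked box. The column names (frame, x, y, w, h) are assumptions; check the CSV header for the actual names.

import pandas as pd

boxes = pd.read_csv("roi_videos_data/tracked_boxes.csv")

def gaze_in_box(gaze_x, gaze_y, box):
    # True if the gaze sample lies inside this frame's tracked box
    return (box["x"] <= gaze_x <= box["x"] + box["w"]
            and box["y"] <= gaze_y <= box["y"] + box["h"])

# e.g., test one gaze sample against the box tracked on frame 100
row = boxes[boxes["frame"] == 100].iloc[0]
print(gaze_in_box(512, 384, row))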
Extract frames
Run convert video to images.py to generate frame_XXXX.png files.
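If you need to adapt the frame extraction, here is a minimal OpenCV sketch of the same idea (the shipped script may differ; the input path is hypothetical):

import cv2

cap = cv2.VideoCapture("replay_recordings/session_video.avi")  # hypothetical path
i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(f"frame_{i:04d}.png", frame)  # matches the frame_XXXX.png naming
    i += 1
cap.release()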
Analyze
Run Auto_AI_ROI_View_Detection.py (or Auto_AI_View_Detection.py). Results saved as openai_response_<date_time>.txt; follow‑ups via Follow_Up_Questions.py.
Requires an OpenAI API key:
Keys & Tokens
Set API keys globally (Windows cmd): setx OPENAI_API_KEY "your-key"
Restart Vizard after setting. AI image analysis can be token‑heavy; consider sampling every N frames and using a dwell threshold (e.g., ≥15 frames at 30 fps ≈ 0.5 s).
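Below is a minimal sketch of that sampling/dwell idea; the helper functions are hypothetical (not part of the shipped scripts) and the constants just mirror the example values above:

import os

api_key = os.environ.get("OPENAI_API_KEY")  # reads the key set via setx above

FPS = 30
SAMPLE_EVERY_N = 5     # analyze only every 5th frame to cut token use
DWELL_FRAMES = 15      # >= 15 frames at 30 fps is roughly 0.5 s

def sample_frames(frame_paths, n=SAMPLE_EVERY_N):
    # Keep every nth extracted frame for AI analysis
    return frame_paths[::n]

def count_views(gaze_on_roi):
    # gaze_on_roi: one boolean per frame (True = gaze inside the ROI);
    # a "view" is counted once a run of True reaches the dwell threshold.
    views, run = 0, 0
    for on in gaze_on_roi:
        run = run + 1 if on else 0
        if run == DWELL_FRAMES:
            views += 1
    return views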
Network Events with External Applications
The External Application Data Recorder can be controlled by an external application (for example, starting the trial with a signal from the other application), can send triggers to external applications, or can exchange additional information back and forth. For information on how this works, see this page.
Additional Extended Features
Rating/Likert Scales & Surveys
Easily collect participant feedback. Customize scale labels and capture responses programmatically and in data exports. Ratings must be collected before or after the external session.
Inputs/Demographics
Gather participant data (e.g., age, ID, gender) before starting the external session.
Adding a Label/Condition
Tag sessions with experimental conditions for sorting and analysis.
Flags, Network Events, Button Clicks
Enable logging of custom triggers (e.g., spacebar presses, network signals) during the session for synchronized event tracking.
External apps can send JSON over UDP (e.g., {"event":"start_trial"} or {"event":"sync"}) to trigger local actions like syncEvent() or simply be logged. Make sure the external app sends UTF-8 JSON over UDP to the same NETWORK_HOST/NETWORK_PORT.
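For example, a minimal Python sender using the default NETWORK_HOST/NETWORK_PORT from the config above (adjust if you changed them):

import json
import socket

def send_event(event_name, host="localhost", port=4950):
    # Encode the event as UTF-8 JSON and fire it over UDP
    payload = json.dumps({"event": event_name}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

send_event("start_trial")  # starts the trial when NETWORK_START=True
send_event("sync")         # or log a custom sync marker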
Speech Recording (optional)
Record microphone input for later analysis or transcription.
Transcriptions
Combine mic recordings with post-session transcription tools to create searchable dialogue data.
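One common approach (an assumption, not necessarily the tool SightLab ships) uses the SpeechRecognition package on the saved WAV file; the file path below is hypothetical:

import speech_recognition as sr  # pip install SpeechRecognition

r = sr.Recognizer()
with sr.AudioFile("data/2024-01-01_P01/mic_recording.wav") as source:  # hypothetical path
    audio = r.record(source)          # read the whole recording
print(r.recognize_google(audio))      # transcribe via Google's free web API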
Instructions
Show instructions or display guidance on the mirrored desktop before launching the external app.
Plotly for Additional Data Analysis
Replay session data with built-in Plotly tools to visualize gaze, movement, and behavioral metrics.
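As a quick hedged sketch with Plotly Express (the CSV path and column names are assumptions, so check your session's data files for the actual headers):

import pandas as pd
import plotly.express as px

df = pd.read_csv("data/2024-01-01_P01/gaze_data.csv")   # hypothetical path
fig = px.line(df, x="time", y=["gaze_x", "gaze_y"])     # hypothetical columns
fig.show()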
Face Tracking and Expression Analysis
Automatically capture facial expressions with supported headsets (e.g., Meta Quest Pro) if enabled in the config.
Average Physiological Data
Biopac integration allows tracking and averaging of heart rate, skin conductance, and cognitive load throughout the session.
Baseline
Record a short “resting” or neutral task before launching the external app to establish baseline physiological readings.
Biofeedback Ball
Display a 3D object that responds to physiological data streams.
Lab Streaming Layer
Connect to additional devices via Lab Streaming Layer.
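A minimal pylsl sketch for pulling samples from a stream (pip install pylsl); the stream type 'EEG' is just an example:

from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("type", "EEG")   # block until a matching stream is found
inlet = StreamInlet(streams[0])
for _ in range(10):                       # pull a few samples as a smoke test
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)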
Limitations and Tips
You can also use SteamVR to run the application and then use "Display VR View" to mirror the headset, selecting that window as the one to capture (note: don't use VR View with Real Time Streaming, as it will override the view). This may be necessary for the Meta Quest Pro with SteamVR.
If the replay screen appears with a distorted aspect ratio, wrong colors, or is otherwise off: install the K-Lite codec pack from https://codecguide.com/download_kl.htm
Results may be unreliable if the headset's eye tracking is not calibrated or the head direction is off.
Face tracking data can be found in the data folder and visualized using the facial_expressions_over_time.py script.
Z‑depth caveat: Gaze point in external apps hits a screen‑space collider; Z accuracy is not object‑aware.
Headsets will not save head orientation (though it will still show in the replay with the virtual screen).
Note: For the Vive Focus Vision and Vive Focus 3, you need to use the SRanipal driver instead of OpenXR. Download here. (You don't need to run the Vive Console, just Vive Streaming, but the Vive Console software has to be installed to have access to the SRanipal driver.)
To verify that the eye tracker is working, we recommend running SightLab_VR.py first and pressing 'p' to see your gaze point moving.
If running with SteamVR, minimize the SteamVR window first so it doesn't show on top of the video.
The eye tracking may need to be calibrated on the device first. You can also calibrate the replay screen: focus on a point in the scene, move the REPLAY_SCREEN_CALIBRATION value with the arrow keys in the replay, and press 'k' to print the numbers, which you can paste into the specific hardware profile you are using. You can also manually move the screen object in the resources folder.
FAQ (Short)
Where are my files?
All sessions go under /data/<date>_<participant>/
Why does the replay look stretched/offset?
Use the replay hotkeys (arrows / PgUp/PgDn / +/-) to dial in the screen, then copy the printed values into the config.
Can I start/stop with the network?
Yes: enable NETWORK_START=True and match event names/host/port with your Unity/Unreal sender.