SightLab Experiment Templates

For the common metrics and tools available in every experiment, see the Common Metrics and Tools section below.

Choice & Perception Templates

Choice Comparison Template
Category: Consumer Preference Studies, Product Placement, Decision-Making Strategies
What Can Be Manipulated:
  • Input method (laser pointer, rating scale, keypress)
  • Objects being compared (e.g., food items, products)
  • Choice categories (e.g., healthy vs. non-healthy food choices)
  • Background environment (e.g., supermarket, restaurant, neutral scene)
  • Trial conditions (e.g., start/stop conditions)
What Can Be Measured:
  • Object chosen
  • Choice confirmation time
  • Eye-tracking (gaze points, dwell time)
  • Physiological responses (if Biopac is connected)
Additional Features:
  • Interactive replay
  • Biopac integration
  • Automatic graph generation (bar charts of selections)

Distance Perception Task
Category: Perception Studies, Spatial Awareness
What Can Be Manipulated:
  • Object positions (randomized X and Z shifts)
  • Number of practice trials
  • Comparison condition (e.g., Time to First Fixation, Performance)
  • Environment and object selection
What Can Be Measured:
  • Performance (correct identification of the unchanged object)
  • Time to First Fixation
  • Eye-tracking (saccade data, intersect positions, individual eye rotation)
  • Head position
  • View counts, dwell time, heatmaps, scan paths
  • Walk path
Additional Features:
  • No-code GUI setup available
  • Data saved with all included SightLab metrics
  • Supports environment and condition customization

Memory Experiment
Category: Spatial Recognition, Cognitive Science
What Can Be Manipulated:
  • Number of objects and colors
  • Object placement
  • Trial count
  • Learning phase duration
  • Environment selection
What Can Be Measured:
  • Time taken to reach each target
  • Eye-tracking (gaze behavior, fixation data)
  • Path taken to each object
  • Performance (correct selections)
  • Walk path
Additional Features:
  • No-code GUI setup available
  • Supports custom object and environment settings
  • Automatic logging of participant performance data
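The templates above expose these options through SightLab's no-code GUI, but the core per-trial bookkeeping is simple. As an illustration only (this is not the SightLab API; the `ChoiceTrial` class and its fields are hypothetical), a choice-comparison trial reduces to recording the chosen object and the confirmation latency:

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChoiceTrial:
    """One comparison trial: which object was chosen, and how long confirmation took."""
    options: List[str]
    chosen: Optional[str] = None
    confirm_time: Optional[float] = None
    _start: float = field(default_factory=time.perf_counter)

    def confirm(self, choice: str) -> None:
        """Record the selection and the elapsed time since trial start."""
        if choice not in self.options:
            raise ValueError(f"{choice!r} is not one of the presented options")
        self.chosen = choice
        self.confirm_time = time.perf_counter() - self._start

# Hypothetical trial comparing two food items.
trial = ChoiceTrial(options=["healthy_snack", "candy_bar"])
trial.confirm("healthy_snack")
print(trial.chosen, trial.confirm_time is not None)
```

In the real templates, selection events would come from the laser pointer, rating scale, or keypress input method rather than a direct function call.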

Reaction Time Templates

Basic Reaction Time Task
Category: Cognitive Processing, Reflex Measurement
What Can Be Manipulated:
  • Object positions
  • Fade-in speed
  • Environment model
What Can Be Measured:
  • Reaction time to visual stimulus
Additional Features:
  • No-code GUI setup available
  • Customizable fade-in duration

Reaction Time with Selector Tool
Category: Cognitive Processing, Interaction Studies
What Can Be Manipulated:
  • Object positions
  • Trial count
  • Environment model
What Can Be Measured:
  • Reaction time to highlight and confirm a selection
Additional Features:
  • Uses gaze or cursor tracking
  • Supports trial customization

Shoot/Don't Shoot Task
Category: Decision-Making Under Pressure
What Can Be Manipulated:
  • Number of threat and neutral objects
  • Object spawn positions
  • Environment model
What Can Be Measured:
  • Reaction time
  • Decision accuracy
Additional Features:
  • Tracks responses to threat vs. neutral objects
  • Customizable object list

Shoot/Don't Shoot in a Maze
Category: Decision-Making Under Pressure, Navigation Tasks
What Can Be Manipulated:
  • Maze structure
  • Proximity sensor placement
  • Object spawn logic
What Can Be Measured:
  • Reaction time
  • Decision accuracy
  • Walk path
Additional Features:
  • Requires navigation through the maze
  • Customizable task duration
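All of the reaction-time templates come down to timestamp arithmetic: record stimulus onset, record the response, and (for the shoot/don't-shoot variants) score accuracy against the object type. A minimal sketch, not the SightLab API (the `score_response` helper and its values are hypothetical):

```python
import time

def score_response(stimulus_onset: float, response_time: float,
                   is_threat: bool, responded: bool):
    """Return (reaction_time, correct) for one shoot/don't-shoot trial.

    Correct behavior = respond to threats, withhold on neutral objects.
    reaction_time is None when the participant withheld a response.
    """
    rt = (response_time - stimulus_onset) if responded else None
    correct = (responded == is_threat)
    return rt, correct

onset = time.perf_counter()
# Simulated press 0.35 s after a threat object appears (hypothetical values).
rt, correct = score_response(onset, onset + 0.35, is_threat=True, responded=True)
print(round(rt, 2), correct)  # 0.35 True
```

The same scoring rule marks a withheld response to a neutral object as correct, which is how decision accuracy differs from plain reaction time in these templates.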

Phobia & Exposure Templates

Fear of Heights (Walk the Plank)
Category: Phobia Exposure, Anxiety Research
What Can Be Manipulated:
  • Height of the plank
  • Surrounding environment
  • Trial conditions
What Can Be Measured:
  • Physiological response
  • Eye-tracking (gaze behavior)
  • Walk path
Additional Features:
  • No-code GUI setup available
  • Biopac integration for physiological tracking

Elevator, Airplane, Arachnophobia
Category: Phobia Exposure, Anxiety Research
What Can Be Manipulated:
  • Type of environment
  • Exposure duration
  • Object interaction level
What Can Be Measured:
  • Physiological response
  • Eye-tracking
  • Walk path
Additional Features:
  • Supports gradual exposure therapy
  • Customizable scenarios

BioFeedback Exposure
Category: Anxiety Management, Physiological Control
What Can Be Manipulated:
  • Exposure intensity
  • Feedback mechanism
What Can Be Measured:
  • Physiological response
  • Control effectiveness
Additional Features:
  • Real-time biofeedback
  • Progressive difficulty levels

General Stimulus Exposure
Category: Sensory Processing & Cognitive Load
What Can Be Manipulated:
  • 3D models (VR and AR, including real-world objects in VR)
  • 360 media
  • Virtual screens
  • Standard media
What Can Be Measured:
  • Eye-tracking (gaze behavior)
  • Interaction engagement
Additional Features:
  • Supports different media types
  • No-code GUI setup available
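The BioFeedback Exposure template adjusts exposure intensity from a live physiological signal. As a hedged sketch of that style of control rule (the function, the heart-rate threshold, and the simulated readings are all hypothetical, not SightLab's actual logic):

```python
def next_exposure_level(current_level: int, heart_rate: float,
                        calm_threshold: float = 80.0, max_level: int = 10) -> int:
    """Step exposure intensity up while the participant stays calm,
    and back down when arousal exceeds the threshold (simplified rule)."""
    if heart_rate <= calm_threshold:
        return min(current_level + 1, max_level)
    return max(current_level - 1, 0)

level = 3
for hr in [72, 75, 95, 78]:  # simulated heart-rate readings (bpm)
    level = next_exposure_level(level, hr)
print(level)  # 3 -> 4 -> 5 -> 4 -> 5
```

In practice the signal would arrive from a connected Biopac device rather than a hard-coded list, and the progression rule could be any of the template's progressive difficulty levels.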

Interaction & Simulation Templates

Mirror Demo
Category: Self-Perception, Embodiment Research
What Can Be Manipulated:
  • Mirror positioning
  • Environment lighting
What Can Be Measured:
  • Self-directed gaze patterns
  • Movement synchronization
Additional Features:
  • Real-time reflection
  • Customizable environment

Grabbing and Reaching Task
Category: Motor Control, Physical Interaction
What Can Be Manipulated:
  • Object distance
  • Object size
  • Object complexity
What Can Be Measured:
  • Reach accuracy
  • Grasp precision
  • Completion time
Additional Features:
  • Hand tracking integration
  • Haptic feedback options

Scripted Avatars
Category: Social Interaction & Bias Studies
What Can Be Manipulated:
  • Avatar gender, race, and voice
  • Text-based dialogue
What Can Be Measured:
  • Participant response time
  • Eye-tracking
Additional Features:
  • Avatars can read dialogue from a text file
  • Dynamic conversation scenarios

Driving Simulator Context
Category: Spatial Awareness, Reaction Time
What Can Be Manipulated:
  • Road conditions
  • Traffic density
  • Environmental distractions
What Can Be Measured:
  • Reaction time
  • Decision accuracy
  • Eye-tracking
Additional Features:
  • Can be combined with physiological tracking
  • Supports hazard response testing

AI-Enabled Role-Playing Agents
Category: Social Simulation, Behavioral Research
What Can Be Manipulated:
  • AI response style
  • Scenario complexity
  • Agent personality traits
What Can Be Measured:
  • Interaction time
  • Response patterns
Additional Features:
  • Dynamic AI responses
  • Supports immersive training scenarios

External Data Recorder
Category: Multi-Modal Data Collection
What Can Be Manipulated:
  • Recording parameters
  • Integration with external systems
What Can Be Measured:
  • Multiple data streams
  • Synchronized recordings
Additional Features:
  • Compatible with various data formats
  • Supports real-time monitoring
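The External Data Recorder synchronizes streams sampled at different rates. One common alignment strategy, shown here as a generic sketch (not SightLab's implementation; the stream contents are hypothetical), is nearest-timestamp lookup against a sorted stream:

```python
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t.

    Assumes timestamps are sorted ascending, as recorded streams usually are.
    """
    i = bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

# Hypothetical 60 Hz gaze stream queried at a slower stream's timestamp.
gaze_t = [0.000, 0.016, 0.033, 0.050]
gaze_v = ["obj_A", "obj_A", "obj_B", "obj_B"]
print(nearest_sample(gaze_t, gaze_v, 0.030))  # closest to 0.033 -> obj_B
```

Nearest-timestamp lookup is the simplest option; interpolation or event-marker alignment (as with Lab Streaming Layer) may be preferable when the streams drift.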

Common Metrics and Tools

Eye Tracking Specific
  • Gaze event count per object
  • Dwell time per object
  • Fixations and saccades
  • Pupil diameter and eye openness (headset-dependent)
  • Time to First Fixation
  • Fixation Sequence Analysis
  • Area of Interest (AOI) Analysis
  • Gaze Contingent Display
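Metrics such as dwell time per object are aggregated from timestamped gaze samples. A simplified illustration (not the SightLab implementation; it assumes a fixed sampling rate rather than measuring per-sample gaps):

```python
from collections import defaultdict

def dwell_times(samples, sample_interval=1 / 60):
    """Sum per-object dwell time from (timestamp, object_name) gaze samples
    recorded at a fixed rate; None means gaze was not on any tracked object."""
    totals = defaultdict(float)
    for _t, obj in samples:
        if obj is not None:
            totals[obj] += sample_interval
    return dict(totals)

# Hypothetical 60 Hz trace: half a second on each of two objects.
samples = [(i / 60, "cereal_box" if i < 30 else "soda_can") for i in range(60)]
print(dwell_times(samples))  # ~0.5 s per object
```

Real pipelines would also separate raw gaze samples into fixations and saccades before attributing dwell, which is why both appear as distinct metrics above.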
Visualizations
  • Heatmaps
  • Scan Paths
  • Fixation and Dwell Spheres
  • Bar charts and comparison graphs
  • Interactive playback
  • Multi-user interactions
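Heatmaps are produced by binning gaze points into a grid before rendering. A minimal sketch of the binning step, assuming normalized screen coordinates in [0, 1] (this is an illustration, not SightLab's renderer):

```python
from collections import Counter

def gaze_heatmap(points, bins=4):
    """Bin normalized (x, y) gaze points into a bins x bins count grid,
    keyed by (row, col); counts drive the heatmap's color intensity."""
    grid = Counter()
    for x, y in points:
        col = min(int(x * bins), bins - 1)  # clamp x == 1.0 into the last bin
        row = min(int(y * bins), bins - 1)
        grid[(row, col)] += 1
    return grid

# Hypothetical points: two near the top-left, one near the bottom-right.
pts = [(0.1, 0.1), (0.12, 0.15), (0.9, 0.9)]
heat = gaze_heatmap(pts)
print(heat[(0, 0)], heat[(3, 3)])  # 2 1
```

The resulting counts are typically smoothed (e.g., with a Gaussian kernel) and mapped to a color scale before being overlaid on the scene.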
Body Position Tracking
  • Head and Hand Position (6 DOF)
  • Full body and finger tracking (hardware-dependent)
  • Face tracking and expression output
Experiment Metrics
  • Experiment management (trial numbers, participant data)
  • Proximity-based interactions
  • Survey and rating tools
  • Button clicks, interactions, grab and release events
  • Data analysis for independent vs. dependent variables
Third-Party Connections
  • Physiological data (BIOPAC, fNIRS, EEG via Lab Streaming Layer)
  • Synchronization with AcqKnowledge
  • Real-time biofeedback