# Shopping Visual Search
This experiment simulates a visual search task within a virtual supermarket aisle. Participants are instructed to locate and grab a target item from a dense array of shelf-placed distractor items using gaze and controller-based interaction.
## Overview
Participants are placed in a VR environment and presented with instructions to locate a specific food item (e.g., Breakfast Tuna). Items are randomly placed across left and right shelf grids, with a single target placed randomly among them.
When the target is found (grabbed), feedback is presented (an image quad and a sound), and the time taken to locate the target is recorded to the experiment summary.
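In outline, a single trial follows the sequence below. This is only a sketch of the flow just described, not the shipped experiment script; the task name and structure are placeholders:

```python
import viztask

def trialTask():
    # Show head-locked instructions and wait for the participant to start
    yield sightlab.showInstructions(image='sightlab_resources/instruction1.jpg')
    yield viztask.waitEvent('triggerPress')

    trial_start_time = sightlab.hud.getCurrentTime()
    # ...participant searches the shelves; the grab handler (see
    # "Grabbing and Timing" below) records "Time to Find Target"...
```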
## Core Features
- Random placement of 3D food models on two opposing shelves.
- Trial logic with instructions and fade-in/out transitions.
- Grab detection with optional visual feedback.
- Timing from trial start to object grab, logged as `"Time to Find Target"`.
- Image-based instructions head-locked to the participant's view.
- Optional audio and environment constraints.
## Configurable Parameters
| Variable | Description | Default Value |
|---|---|---|
| `TARGET_OBJECT` | The internal name of the target item to find | `"BreakfastTuna"` |
| `TARGET_FILE` | The filename of the `.osgb` model for the target | `"f_breakfastjerky_7A.osgb"` |
| `SHOW_ITEM_INFO` | Whether to display a quad image above the found object | `True` |
| `left_origin` / `right_origin` | Grid positions for shelf start locations | Custom 3D coords |
| `cols`, `rows` | Grid layout dimensions for shelf item placement | 25 cols × 5 rows |
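These parameters typically sit near the top of the experiment script. The sketch below uses the defaults from the table; the coordinate values for `left_origin` and `right_origin` are illustrative assumptions, not the shipped defaults:

```python
TARGET_OBJECT = "BreakfastTuna"            # internal name of the target item
TARGET_FILE = "f_breakfastjerky_7A.osgb"   # .osgb model file for the target
SHOW_ITEM_INFO = True                      # show a quad image above the found object

# Shelf start positions (x, y, z) -- example values only
left_origin = [-2.0, 1.5, 2.0]
right_origin = [2.0, 1.5, 2.0]

cols, rows = 25, 5                         # shelf grid layout
```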
## Components
### ShelfGrid System
The `ShelfGrid` class (`grid_placer.py`) provides methods to position items in a grid with optional rotation and depth offset.

Modify:

- Grid size and orientation
- Sampling behavior (use `.sample_positions(n)` if desired; see the sketch below)
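A minimal placement sketch follows. The `ShelfGrid` constructor arguments shown here are assumptions inferred from the parameters above, not the confirmed signature; check `grid_placer.py` for the actual interface:

```python
import viz
from grid_placer import ShelfGrid

# Assumed constructor arguments (origin, cols, rows) -- verify against grid_placer.py
left_grid = ShelfGrid(origin=left_origin, cols=cols, rows=rows)

# Place one example model at every grid position on the left shelf
for pos in left_grid.all_positions():
    item = viz.addChild('resources/foods/f_breakfastjerky_7A.osgb')
    item.setPosition(pos)
```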
### Scene Object Management
To add each item into the scene with SightLab tracking capabilities (gaze and grab), use:
```python
sightlab.addSceneObject(item_name, item, gaze=True, grab=True)
```
All added objects are also tracked in a `scene_items` list for batch visibility control:

```python
for item in scene_items:
    item.visible(viz.OFF)  # hide all items, for example
```
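For example, per-item registration and list tracking can happen in a single loop. The `food_models` mapping below is a hypothetical name-to-file dictionary, not part of the shipped script:

```python
scene_items = []
for item_name, model_file in food_models.items():  # food_models is hypothetical
    item = viz.addChild('resources/foods/' + model_file)
    sightlab.addSceneObject(item_name, item, gaze=True, grab=True)  # enable gaze + grab tracking
    scene_items.append(item)  # keep a handle for batch visibility control
```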
### Instructions Panel
Image-based instructions are locked to the participant's head using the following logic:
```python
headRoot = viz.addGroup()
viz.link(mainViewNode, headRoot, mask=viz.LINK_ALL).preTrans([0, -2.0, 1.2])
sightlab.instructionsImageQuad.setParent(headRoot)
```
You can scale the quad to suit portrait-style instructions:

```python
sightlab.instructionsImageQuad.setScale([1.2, 1.8, 1])  # taller portrait
```

For landscape instructions, consider:

```python
sightlab.instructionsImageQuad.setScale([1.8, 1.2, 1])  # wider landscape
```
### Grabbing and Timing
When the participant grabs the target item:
- ✅ Sound feedback is played.
- ✅ A floating image quad can be displayed over the item.
- ✅ The time from trial start to grab is measured with:

```python
time_to_find = round(sightlab.hud.getCurrentTime() - trial_start_time, 3)
```

- ✅ The result is saved to the experiment summary:

```python
sightlab.setExperimentSummaryData("Time to Find Target", time_to_find)
```
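Putting these pieces together, the target-grab logic might look like the sketch below. How the grab event reaches this function depends on the experiment script (e.g., `tools/grabber.py`), so the callback wiring is omitted and `item_found_quad` is a placeholder node name:

```python
def on_target_grabbed():
    # Play the success sound (see Audio Feedback under Possible Modifications)
    foundAudio.play()

    # Optionally show the floating image quad over the item
    if SHOW_ITEM_INFO:
        item_found_quad.visible(viz.ON)  # placeholder for the info quad node

    # Record time from trial start to grab in the experiment summary
    time_to_find = round(sightlab.hud.getCurrentTime() - trial_start_time, 3)
    sightlab.setExperimentSummaryData("Time to Find Target", time_to_find)
```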
## Running the Experiment
To run the full experiment:

1. Run `Main_Shopping_Search.py`.
2. Choose options such as whether to run BIOPAC and eye-tracking thresholds, then click **Continue**.
3. Choose hardware from the dropdown (note: this list can be modified).
4. Enter the Participant ID and click **Submit**.
5. Put on the headset, read the instructions, and press the trigger to begin the task.
6. Look for the intended target and use the trigger buttons to grab items (the target is identified when grabbed).
7. After the item is found, press **Spacebar** to end the session; the data will be saved (this might take a few moments).
8. Run `SightLabVR_Replay` to see a replay of the session with visualizations.
9. View the data in the `data` folder.
## Possible Modifications
- Place new `.osgb` model files in `resources/foods/`.
- Set `gui=False` when initializing `SightLab()` if not using the GUI.
- Configure `vizconnect` for your headset, controller, and eye tracker.
- **Item Sampling**: Replace `grid.all_positions()` with `grid.sample_positions(n)` to reduce the number of distractor items.
- **ROI Tracking**: Use the Scene Editor (Inspector) to place `RegionOfInterest.osgb` objects and register them with:

```python
sightlab.addSceneObject("roi_name", roi_object, gaze=True)
```

- **Audio Feedback**: Replace the sound files:

```python
backgroundAudio = viz.addAudio('Resources/audio/your_custom_loop.wav')
foundAudio = viz.addAudio('Resources/audio/your_success_sound.wav')
```

- **Dynamic Instructions**: You can show multiple instruction slides (see the coroutine wrapper after this list):

```python
yield sightlab.showInstructions(image='sightlab_resources/instruction1.jpg')
yield viztask.waitEvent('triggerPress')
yield sightlab.showInstructions(image='sightlab_resources/instruction2.jpg')
```
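The `yield`-based slides above need to run inside a `viztask` coroutine. A minimal wrapper, with an illustrative task name:

```python
import viztask

def instructionsTask():
    yield sightlab.showInstructions(image='sightlab_resources/instruction1.jpg')
    yield viztask.waitEvent('triggerPress')  # wait for the controller trigger
    yield sightlab.showInstructions(image='sightlab_resources/instruction2.jpg')

viztask.schedule(instructionsTask)
```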
## Assets Structure
```text
resources/
├── audio/
│   ├── supermarket.wav
│   └── tada.wav
├── foods/
│   ├── f_breakfastjerky_7A.osgb
│   └── ... (other distractor items)
└── images/
    └── item_found.png
```
## Notes
- Use `sightlab.getEnvironment().visible(viz.OFF)` to hide the environment at startup.
- Use optimized or low-poly `.osgb` models for better performance.
- Be sure to add all objects using `sightlab.addSceneObject(..., gaze=True, grab=True)` to ensure proper data collection.
## Dependencies
- `sightlab_utils`
- `viztask`, `vizconnect`, `vizproximity`
- Custom tools: `grid_placer.py`, `tools/grabber.py`
## Authoring Tips
- Use Inspector to place and align 3D models on shelves.
- Adjust object orientation with `.getEuler()` and `.setEuler()`.
- Enable proximity debugging with the line below (see the setup sketch after this list):

```python
manager.setDebug(viz.ON)
```
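If your script does not already create a proximity manager, a standard `vizproximity` setup looks like the following; the box size and the tracked `item` are placeholders:

```python
import viz
import vizproximity

manager = vizproximity.Manager()
manager.setDebug(viz.ON)  # draw sensor and target bounds in the scene

# Example: a box sensor around a shelf item, with the main view as the target
sensor = vizproximity.Sensor(vizproximity.Box([0.5, 0.5, 0.5]), source=item)
manager.addSensor(sensor)
manager.addTarget(vizproximity.Target(viz.MainView))
```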
## Additional Options
This experiment can be extended in a variety of ways depending on your research goals:
- Multi-user mode (server/client) for collaborative or competitive tasks
- Biopac integration for synchronized physiological data collection
- Real-time biofeedback to visualize participant state (e.g., heart rate, EDA)
- Timed trials with fixed durations or dynamic cutoffs
- User surveys and self-report (ratings, open-text input)
- Custom event tracking (e.g., time in ROI, distance walked)
- Condition randomization or block-based designs
- Gaze zone and ROI analysis using tagged scene regions
- Environmental changes like lighting, sound, or object density per trial
- GUI integration for managing trial parameters, objects, and logging visually
These features can be enabled or combined depending on your experimental design needs.