
Public Speaking

Description

Note: Available upon request. Contact support@worldviz.com or sales@worldviz.com

The subject is seated in a chair and gives a speech following instructions that appear on a monitor. Several factors (such as the audience's attitude) are expected to affect the subject's anxiety. GSR and ECG data can be acquired via BIOPAC.

The following parameters can be manipulated by the experimenter via a separate screen:

  1. Are the other people looking at you?
  2. Are the other people displaying frowns or smiles?
  3. Are the other people showing boredom?
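The three audience factors above can be thought of as a small piece of state the experimenter toggles during the session. The sketch below is illustrative only; the class and field names are hypothetical and not part of SightLab's actual API.

```python
from dataclasses import dataclass

@dataclass
class AudienceState:
    """Hypothetical container for the experimenter-controlled factors above."""
    gazing_at_subject: bool = True   # 1. are the avatars looking at the subject?
    expression: str = "neutral"      # 2. "smile", "frown", or "neutral"
    bored: bool = False              # 3. are the avatars showing boredom?

# The experimenter's control screen would mutate this state mid-session:
state = AudienceState()
state.expression = "frown"
state.bored = True
```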

Running the experiment

Run PublicSpeaking_SightLab.py to start the application

Press Spacebar to fade out the gray quad and start the session

The participant, wearing an HMD, is asked to deliver a short speech while the experimenter changes the parameters.

Key Mapping:

  • Scroll the text the subject is reading: Up or Down arrow keys (or the subject can use the left-hand controller's X and Y buttons to do this themselves)
  • Raise or lower the curtain: "Z", or click the on-screen button with the mouse
  • Increase or decrease the number of avatars: "+" or "-" (or click the on-screen button)
  • Change avatar attitudes: keys "a"-"h" (or click the on-screen buttons)
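The key mapping above amounts to a key-to-action dispatch. A minimal sketch is shown below; the shipped script binds these keys through Vizard's event system, and the action names here are hypothetical.

```python
# Hypothetical dispatch table mirroring the key mapping above.
actions = {
    "up":   "scroll_text_up",
    "down": "scroll_text_down",
    "z":    "toggle_curtain",
    "+":    "add_avatar",
    "-":    "remove_avatar",
}

# Attitude keys "a"-"h" map to the avatar-attitude buttons.
for i, key in enumerate("abcdefgh"):
    actions[key] = f"set_attitude_{i + 1}"

def handle_key(key: str) -> str:
    """Return the action bound to a key, or 'ignored' for unmapped keys."""
    return actions.get(key, "ignored")
```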

Session Replay and Data Files

Run SightLabVR_Replay.py to view scan paths, heatmaps, and other visualizations of eye-tracking and user data

View the raw data files in the data folder
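The raw data files can be loaded with standard tooling. The sketch below assumes a per-session CSV layout; the actual column names in SightLab's output may differ, and the ones shown here are illustrative only.

```python
import csv
import io

# Illustrative sample standing in for a file from the data folder;
# column names are assumptions, not SightLab's documented schema.
sample = io.StringIO(
    "timestamp,gaze_x,gaze_y\n"
    "0.016,0.51,0.49\n"
    "0.033,0.52,0.48\n"
)

# DictReader yields one dict per data row, keyed by the header line.
rows = list(csv.DictReader(sample))
```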

Modifying

Change the scrolling text by editing config/speechOnSpeech.txt
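Scrolling the speech text amounts to showing a sliding window of lines from that file. A minimal sketch, assuming the file is plain UTF-8 text; the helper names are illustrative and not part of SightLab.

```python
from pathlib import Path

def load_speech_lines(path: str = "config/speechOnSpeech.txt") -> list[str]:
    # Read the speech text and split it into lines for scrolling.
    return Path(path).read_text(encoding="utf-8").splitlines()

def visible_window(lines: list[str], offset: int, height: int = 5) -> list[str]:
    # Return the slice of lines currently shown on the monitor;
    # Up/Down arrow presses would decrement/increment `offset`.
    return lines[offset:offset + height]
```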

Change other parameters in config.py

Adjust built-in SightLab parameters, such as recording a video of the session to sync with AcqKnowledge, here

sightlab = sl.SightLab(gui=False, pid=False, screenrecord=False, biopac=True)

Other SightLab features (such as Multi-User, Face Tracking, 360 media, Virtual Screens, and saving transcriptions and audio) can be added with little effort
