Adding Avatar Agents (NPCs)
Avatars in SightLab can represent non-player characters (NPCs), animated agents, conversational AI characters, or tracked representations of participants. This page focuses on adding NPC-style avatars to a scene, positioning them, animating them, and ensuring they appear correctly in data collection and replay.
Avatars can be imported in .osgb, .fbx, .glTF, or .cfg formats and work with most common avatar pipelines.
What Are You Trying to Do?
Use this table to jump to only what you need:
| Goal | Read These Sections |
|---|---|
| Show a person in the scene | Quick Start → Adding & Placing Avatars |
| Play animations on an NPC | Animating the Avatar |
| Add facial morphs | Quick Start → Animating the Avatar |
| Collect gaze or interaction data on an avatar | Data Collection & Replay |
| Add a conversational AI character | AI-Enabled Avatars |
| Represent the participant's body | Tracked / Full-Body Avatars |
Quick Start (Minimal Working Example)
The simplest way to add avatars is to place them directly in your scene using Inspector, then reference them by name in code. No hard-coded positions or extra placement utilities needed.
Step 1 – Add an avatar to your scene in Inspector
- Open your environment/scene in Inspector (click "Edit" in the GUI to open your scene in Inspector, or use the shortcut to the Inspector app)
- Use File → Add or drag and drop an avatar file into the scene
- Position and rotate the avatar using the translate and rotate tools
- Note the avatar's file name (right-click → copy name)
💡 Tip: Built-in avatars are available in the Avatars tab of the SightLab Dashboard asset browser, or you can drag and drop your own avatar files directly into Inspector.
Note: For E-Learning Lab you can drag and drop avatars and choose the animation state and morphs using the built-in E-Learning Lab GUI
Step 2 – Reference the avatar in your script
```python
import sightlab_utils.sightlab as sl
from sightlab_utils.settings import *

sightlab = sl.SightLab()

def sightLabExperiment():
    yield viztask.waitEvent(EXPERIMENT_START)
    viz.callback(viz.getEventID('ResetPosition'), sightlab.resetViewPoint)
    for trial in range(sightlab.getTrialCount()):
        yield viztask.waitEvent(TRIAL_START)

        # Copy and paste just these 4 lines into your project,
        # changing the name and animation state you want
        env = sightlab.getEnvironment()
        avatarObject = env.getChild('CC2_f001_hipoly_A0_v2.cfg')
        avatarObject.state(1)
        sightlab.addSceneObject('avatarObject', avatarObject, gaze=True, avatar=True)

        yield viztask.waitEvent(TRIAL_END)

viztask.schedule(sightlab.runExperiment)
viztask.schedule(sightLabExperiment)
```
That's it — the avatar is already placed in the scene, so there's no need to set positions in code.
Supported Avatar Sources
SightLab works with avatars from many common libraries:
| Source | Description | Link |
|---|---|---|
| Mixamo | Free rigged characters and animations | mixamo.com |
| Ready Player Me | Customizable avatars from photos | Workflow Guide |
| Reallusion | Character Creator & ActorCore library | reallusion.com |
| RocketBox | Microsoft's research avatar library | GitHub |
| Avaturn / MetaPerson | AI-generated avatars | Avaturn / MetaPerson |
📦 Browse more resources at the WorldViz Asset Browser
Included Avatars
Built-in avatars can be found in the Avatars tab of the SightLab Dashboard asset browser. From there you can drag and drop them directly into your scene in Inspector.
Sample avatars also ship with SightLab in these locations:
| Path | Contents |
|---|---|
| `sightlab_resources/avatar/full_body` | Full-body rigged avatars |
| `sightlab_resources/avatar/Complete Character` | Complete character models |
| `sightlab_utils/resources/avatar` | Utility avatars |
Adding & Placing Avatars in Inspector
The recommended workflow is to add avatars directly to your scene in Inspector (the 3D model editor tool). This gives you visual control over placement without writing any positioning code.
Step 1 – Add the Avatar
- Open your environment in Inspector
- Use File → Add or drag and drop an avatar file into the scene
- Use avatars from the Avatars tab in the SightLab Dashboard asset browser, or any `.osgb`, `.fbx`, `.glTF`, or `.cfg` file
- The avatar will appear in the scene graph
Step 2 – Position the Avatar
- Select the avatar's transform node in the scene graph
- Use the translate and rotate tools to place the avatar where you want it
- Save the scene
Step 3 – Reference in Code
Use env.getChild() with the avatar's file name to access it at runtime. You can find the name by right-clicking the avatar in Inspector and copying it.
```python
env = sightlab.getEnvironment()
avatarObject = env.getChild('MyAvatar.cfg')
```
From there you can set animations, facial morphs, and register it for data collection — all without needing to set positions in code.
Verifying an Avatar in Inspector
Before adding an avatar to your scene, you can verify it standalone in Inspector:
What to Check
- Scale (meters)
- Root transform
- Available animations
- Facial morphs (if present)
To open an avatar on its own:
- Right-click the avatar file → Open with Inspector
- Or open Inspector and use File → Open
To verify scale:
- Select the top root node
- Confirm height is reasonable (≈1.6–1.8 meters)
If scaling is needed:
- Right-click the root node
- Choose Insert Above → Transform (if there is not already a transform)
- Adjust scale using the transform gizmo
You can also preview animations in the Animations tab.
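The scale check above is simple arithmetic: divide the target height in meters by the height Inspector reports, and apply that factor on the inserted transform. A minimal sketch (the measured height of 175 is a made-up example of a model exported in centimeters):

```python
def scale_factor(measured_height, target_height=1.75):
    """Uniform scale needed to bring an avatar to a target height in meters."""
    return target_height / measured_height

# Example: a model exported in centimeters shows up 100x too large,
# so a 1.75 m character measures 175 units tall in Inspector.
factor = scale_factor(175.0)  # -> 0.01: scale the inserted transform by 0.01
```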
Animating the Avatar
Once you have a reference to the avatar via env.getChild(), you can set animations and facial morphs directly:
```python
env = sightlab.getEnvironment()

# Get the avatar by name
avatarObject = env.getChild('CC2_f001_hipoly_A0_v2.cfg')

# Set animation state - change the number to the animation you want
avatarObject.state(1)

# Add a facial morph - first number is the morph to use, second is strength
avatarObject.setMorph(0, 1)
```
Animation indices depend on the avatar and can be previewed in Inspector's Animations tab.
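Because indices differ per model, it can help to keep a name-to-index map in your script instead of scattering magic numbers. A small sketch (the names and indices here are placeholders; substitute whatever Inspector's Animations tab shows for your avatar):

```python
# Placeholder indices - preview your avatar in Inspector's Animations tab
# and substitute the numbers it shows for your model.
ANIMATIONS = {
    'idle': 1,
    'walk': 2,
    'wave': 3,
}

# avatarObject.state(ANIMATIONS['wave'])  # reads better than avatarObject.state(3)
```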
Animation Examples
```python
# Loop an animation
avatarObject.state(1)

# Play an animation once
avatarObject.execute(1)

# Blend between animations
avatarObject.blend(1, 2, 0.5)  # 50% blend between states 1 and 2

# Set animation speed
avatarObject.speed(1.5)  # 1.5x speed
```
Changing Animation with a Keypress
```python
def setStateAvatar():
    avatarObject.state(2)

vizact.onkeydown('v', setStateAvatar)
```
Animation Triggers
You can trigger animations based on:
- Events – Gaze enter/exit, proximity, button press
- Time – After a delay or at specific trial moments
- AI Logic – Dynamic responses from intelligent agents
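Whatever the trigger source, the pattern is the same: map an event to an animation state and apply it when the event fires. Here is a framework-agnostic sketch of that bookkeeping (the event names and state indices are placeholders, and `apply_state` stands in for something like `avatarObject.state` in your script):

```python
class AnimationTriggers:
    """Maps event names to animation state indices and applies them on fire."""

    def __init__(self, apply_state):
        self._apply_state = apply_state  # e.g. avatarObject.state in SightLab
        self._triggers = {}

    def on(self, event_name, state_index):
        """Register an animation state to apply when event_name fires."""
        self._triggers[event_name] = state_index

    def fire(self, event_name):
        """Apply the mapped state if one exists; return whether it fired."""
        if event_name in self._triggers:
            self._apply_state(self._triggers[event_name])
            return True
        return False

# Usage with a stand-in for avatarObject.state:
applied = []
triggers = AnimationTriggers(applied.append)
triggers.on('gaze_enter', 2)   # participant looks at the avatar -> wave
triggers.on('proximity', 3)    # participant walks close -> turn to face
triggers.fire('gaze_enter')
print(applied)  # [2]
```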
Data Collection & Replay
Collecting Data on Avatars
Avatars can be treated like any other SightLab scene object for data collection:
| Metric | Description |
|---|---|
| Gaze | Where participant is looking on the avatar |
| View counts | How many times avatar was viewed |
| Dwell time | Total time spent looking at avatar |
| Interaction events | Grabs, clicks, proximity triggers |
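Conceptually, the view-count and dwell-time metrics in the table are accumulations over gaze enter/exit events. SightLab computes these for you; the sketch below only illustrates the bookkeeping, with made-up timestamps:

```python
def gaze_summary(events):
    """Summarize (timestamp, 'enter'|'exit') gaze events for one object."""
    views = 0          # how many times gaze entered the object
    dwell = 0.0        # total seconds gaze stayed on the object
    entered_at = None
    for t, kind in events:
        if kind == 'enter':
            views += 1
            entered_at = t
        elif kind == 'exit' and entered_at is not None:
            dwell += t - entered_at
            entered_at = None
    return {'view_count': views, 'dwell_time': dwell}

# Two looks at the avatar: one of 0.5 s and one of 1.5 s
summary = gaze_summary([(1.0, 'enter'), (1.5, 'exit'), (4.0, 'enter'), (5.5, 'exit')])
print(summary)  # {'view_count': 2, 'dwell_time': 2.0}
```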
To add an avatar to the tracked SightLab replay and track gaze data, register it with addSceneObject:
```python
avatarObject = env.getChild('CC2_f001_hipoly_A0_v2.cfg')
avatarObject.state(1)
sightlab.addSceneObject('avatarObject', avatarObject, gaze=True, avatar=True)
```
⚠️ Important: Setting `avatar=True` ensures the avatar appears correctly in replay. Without this flag, the avatar may not be visible during session playback.
No additional gaze-event code is required.
Related Pages
- Conversational / AI characters 👉 AI-Enabled Avatars
- Participant-tracked bodies 👉 Tracked / Full-Body Avatars
Creating or Capturing Animations
Animation Tools
| Tool | Best For |
|---|---|
| Kinetix | AI-powered animation from video |
| DeepMotion | Motion capture from video |
| Rokoko | Affordable mocap suits & gloves |
| Motionbuilder | Professional animation editing |
| Mocopi | Sony's portable mocap sensors |
High-End Motion Capture
| System | Description |
|---|---|
| Xsens | Inertial mocap suits |
| OptiTrack | Optical tracking systems |
| Vicon | Industry-standard optical mocap |
Troubleshooting
Common Issues & Fixes
| Issue | Cause | Fix |
|---|---|---|
| Avatar too large/small | Incorrect export scale | Add transform above root in Inspector and use scale tool |
| Avatar missing in replay | Missing flag | Ensure avatar=True in addSceneObject() |
| Animation not playing | Wrong index or incompatible skeleton | Verify animation index in Inspector; check skeleton compatibility |
| Avatar in T-pose | Animation not applied | Call avatarObject.state(n) after getting the child reference |
| Avatar facing wrong direction | Rotation offset | Rotate the avatar in Inspector, or use avatarObject.setEuler(180, 0, 0) |
| Avatar floating/underground | Y-position offset | Adjust position in Inspector, or fix root transform |
| Animations jittery | Frame rate or blending issue | Check animation FPS matches; avoid rapid state changes |
| Avatar not receiving gaze data | Not registered | Ensure gaze=True in addSceneObject() |
Complete Example Script
Here is a full working example with multiple avatars, animations, facial morphs, data collection, and a keypress-triggered animation change:
```python
import sightlab_utils.sightlab as sl
from sightlab_utils.settings import *

sightlab = sl.SightLab()

def sightLabExperiment():
    yield viztask.waitEvent(EXPERIMENT_START)
    viz.callback(viz.getEventID('ResetPosition'), sightlab.resetViewPoint)
    for trial in range(sightlab.getTrialCount()):
        yield viztask.waitEvent(TRIAL_START)
        env = sightlab.getEnvironment()

        # Add Avatar - get the name from Inspector by right-clicking and copying it
        avatarObject = env.getChild('CC2_f001_hipoly_A0_v2.cfg')

        # Set animation state - change the number to the animation you want
        avatarObject.state(1)

        # Add a facial morph - first number is the morph to use, second is strength
        avatarObject.setMorph(0, 1)

        # Add to the tracked SightLab replay and track gaze data
        sightlab.addSceneObject('avatarObject', avatarObject, gaze=True, avatar=True)

        avatarObject2 = env.getChild('CC2_m001_hipoly_A0_v2.cfg')
        avatarObject2.state(1)
        sightlab.addSceneObject('avatarObject2', avatarObject2, gaze=True, avatar=True)

        # Change animation state with a keypress
        def setStateAvatar():
            avatarObject2.state(2)
        vizact.onkeydown('v', setStateAvatar)

        yield viztask.waitEvent(TRIAL_END)

viztask.schedule(sightlab.runExperiment)
viztask.schedule(sightLabExperiment)
```
An additional example is included at `ExampleScripts/Adding_NPCs.py`.
Alternative: Loading Avatars in Code
If you prefer to load avatars programmatically instead of placing them in Inspector, you can still do so:
```python
avatar = vizfx.addAvatar('sightlab_resources/avatar/full_body/RocketBox_Male1.osgb')
avatar.setPosition(0, 0, 2)
avatar.state(1)
sightlab.addSceneObject('avatar', avatar, gaze=True, avatar=True)
```
You can also use the (legacy) stand-in avatar method with `avatarPlacer` for code-loaded avatars that need visual placement:
- Add `standInAvatar.osgb` (from `sightlab_resources/objects`) to your scene in Inspector
- Position the stand-in where you want the avatar
- In code, use `avatarPlacer` to place the real avatar at the stand-in's location:
```python
from sightlab_utils import avatarPlacer

env = sightlab.getEnvironment()
avatarPlacer.place(env, avatar, 'avatarStandin')
```
To hide the stand-in during replay:
```python
def onTrialChanged():
    env = replay.getEnvironmentObject()
    env.getChild('avatarStandin').alpha(0)

viz.callback(TRIAL_LOADED_EVENT, onTrialChanged)
onTrialChanged()
```
More information is available in the Vizard documentation for avatars.