Adding Avatar Agents (NPCs)
Avatars in SightLab can represent non-player characters (NPCs), animated agents, conversational AI characters, or tracked representations of participants. This page focuses on adding NPC-style avatars to a scene, positioning them reliably, animating them, and ensuring they appear correctly in data collection and replay.
Avatars can be imported in .fbx, .glTF, .cfg, or .osgb formats and work with most common avatar pipelines.
What Are You Trying to Do?
Use this table to jump to only what you need:
| Goal | Read These Sections |
|---|---|
| Show a static person in the scene | Quick Start → Placing Avatars |
| Play animations on an NPC | Preparing an Avatar → Animations |
| Collect gaze or interaction data on an avatar | Adding to SightLab |
| Add a conversational AI character | AI-Enabled Avatars |
| Represent the participant’s body | Tracked / Full-Body Avatars |
Quick Start (Minimal Working Example)
If you just want an avatar visible in the scene with data collection and replay support, this is the fastest way to start.
avatar = vizfx.addAvatar('sightlab_resources/avatar/full_body/RocketBox_Male1.osgb')
avatar.setPosition(0, 0, 2)
avatar.state(1) # Play animation state
sightlab.addSceneObject('avatar', avatar, gaze=True, avatar=True)
When this is enough
- Single avatar
- No need for precise placement via Inspector
- No stand-in or scene editing
If you need repeatable placement, multiple avatars, or clean replay visuals, continue below.
Supported Avatar Sources
SightLab works with avatars from many common libraries:
- Mixamo
- Ready Player Me
- Reallusion (Character Creator / ActorCore)
- RocketBox
- Avaturn / MetaPerson
- See more links at the Worldviz Asset Browser
Included Avatars
Sample avatars ship with SightLab and are available here:
- sightlab_resources/avatar/full_body
- sightlab_resources/avatar/Complete Character
- sightlab_utils/resources/avatar
Preparing an Avatar Asset (Recommended)
Before adding an avatar to your experiment, it’s best to verify it in Inspector.
What to Check in Inspector
- Scale (meters)
- Root transform
- Available animations
- Facial morphs (if present)
To open an avatar:
- Right-click the avatar file → Open with Inspector
- Or open Inspector and use File → Open
To verify scale:
- Select the top root node
- Confirm height is reasonable (≈1.6–1.8 meters)
If scaling is needed:
- Right-click the root node
- Choose Insert Above → Transform (if there is not already a transform)
- Adjust scale using the transform gizmo
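If you prefer to correct scale in your script rather than in Inspector, the same fix can be computed from the height you measure on the root node. The helper below is a minimal sketch; the numbers are illustrative assumptions, and setScale is the standard Vizard node call for applying the result.

```python
def uniform_scale(target_height_m, measured_height):
    """Return the uniform scale factor that maps the avatar's measured
    height (in model units) to the desired height in meters."""
    return target_height_m / measured_height

# Example: a model authored in centimeters reads ~175 units tall in Inspector.
scale = uniform_scale(1.75, 175.0)
print(scale)  # 0.01
# In the experiment script: avatar.setScale([scale, scale, scale])
```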
You can also preview animations in the Animations tab.
Adding the Avatar to Your Project
Place avatar files inside your project directory. A recommended structure:
Project/
└── Resources/
└── avatars/
└── MyAvatar.osgb
Then load it in your script:
avatar = vizfx.addAvatar('Resources/avatars/MyAvatar.osgb')
sightlab.addSceneObject("avatar", avatar, gaze=True, avatar=True)
Important:
Setting avatar=True ensures the avatar appears correctly in replay.
Placing Avatars in the Scene (Best Practice)
Why Use a Stand-In Avatar?
Using a placeholder avatar in the environment lets you:
- Position avatars visually
- Avoid hard-coded coordinates
- Keep placement consistent across sessions
Step 1 – Add a Stand-In Avatar in Inspector
- Open your environment in Inspector
- Go to File → Add
- Select standInAvatar.osgb, located in sightlab_resources/objects (or place your avatar directly and use its root node in the later step)
- Select the avatarTransform (or first transform) node
- Translate and rotate the stand-in to the desired position
Step 2 – Place the Real Avatar with avatarPlacer
Import the module:
from sightlab_utils import avatarPlacer
After SightLab starts:
env = sightlab.getEnvironment()
avatarPlacer.place(env, avatar, 'avatarStandin')
To add more avatars:
- Duplicate the stand-in
- Rename it (e.g. avatarStandin2)
- Call avatarPlacer.place() again
avatarPlacer.place(env, secondAvatar, 'avatarStandin2')
Using GEODE or Avatar Root Nodes
avatarPlacer works with:
- GEODE nodes
- Avatar root nodes
avatarPlacer.place(env, avatar, "GEODE_NODE_NAME")
# or
avatarPlacer.place(env, avatar, "avatarRoot")
Animating the Avatar
To play an animation state:
avatar.state(1)
Animation indices depend on the avatar and can be previewed in Inspector.
You can also:
- Blend animations
- Trigger animations via events
- Drive animations from AI or interaction logic
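The event-triggering option above can be sketched with Vizard's vizact helpers. This is a minimal sketch assuming the Quick Start avatar and animation indices 1 and 2 (substitute the indices you previewed in Inspector); avatar.state loops an animation, while avatar.execute plays it once.

```python
import viz
import vizact
import vizfx

viz.go()

avatar = vizfx.addAvatar('sightlab_resources/avatar/full_body/RocketBox_Male1.osgb')
avatar.state(1)  # loop the idle animation

# Play animation 2 once when the spacebar is pressed.
vizact.onkeydown(' ', avatar.execute, 2)

# Or switch to a different looping state on a timer (every 10 seconds).
vizact.ontimer(10, avatar.state, 2)
```

The same callbacks can be wired to gaze or interaction events instead of keys and timers.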
Collecting Data on Avatars
Avatars can be treated like any other SightLab scene object:
- Gaze
- View counts
- Dwell time
- Interaction events
Ensure:
sightlab.addSceneObject(
    "avatar",
    avatar,
    gaze=True,
    avatar=True
)
No additional gaze-event code is required.
Replay Integration
Hide the Stand-In Avatar in Replay
Add this to your replay script:
def onTrialChanged():
    env = replay.getEnvironmentObject()
    env.getChild('avatarStandin').alpha(0)
viz.callback(TRIAL_LOADED_EVENT, onTrialChanged)
onTrialChanged()
This keeps the real avatar visible while removing setup artifacts.
Related Pages
- Conversational / AI Characters 👉 AI-Enabled Avatars
- Participant-Tracked Bodies 👉 Tracked / Full-Body Avatars
Creating or Capturing Animations
Animation Tools
- Kinetix
- DeepMotion
- Rokoko
- Motionbuilder
- Mocopi
High-End Motion Capture
- Xsens
- OptiTrack
- Vicon
Common Issues & Fixes
| Issue | Fix |
|---|---|
| Avatar too large/small | Add transform above root in Inspector and use the scale tool |
| Avatar missing in replay | Ensure avatar=True |
| Stand-in visible in replay | Set alpha to 0 |
| Animation not playing | Verify animation index and skeleton |
Sample Script
A complete working example is included at:
ExampleScripts/Adding_NPCs.py