
Adding Avatar Agents (NPCs)

Avatars in SightLab can represent non-player characters (NPCs), animated agents, conversational AI characters, or tracked representations of participants. This page focuses on adding NPC-style avatars to a scene, positioning them reliably, animating them, and ensuring they appear correctly in data collection and replay.

Avatars can be imported in .osgb, .fbx, .glTF, or .cfg formats and work with most common avatar pipelines.


What Are You Trying to Do?

Use this table to jump to only what you need:

| Goal | Read these sections |
| --- | --- |
| Show a static person in the scene | Quick Start → Placing Avatars |
| Play animations on an NPC | Preparing an Avatar → Animations |
| Collect gaze or interaction data on an avatar | Adding to SightLab |
| Add a conversational AI character | AI-Enabled Avatars |
| Represent the participant's body | Tracked / Full-Body Avatars |

Quick Start (Minimal Working Example)

If you just want an avatar visible in the scene with data collection and replay support, this is the fastest way to start.

import vizfx

# Load the avatar
avatar = vizfx.addAvatar('sightlab_resources/avatar/full_body/RocketBox_Male1.osgb')

# Position and animate
avatar.setPosition(0, 0, 2)
avatar.state(1)  # Play animation state 1

# Register with SightLab for data collection and replay
sightlab.addSceneObject("avatar", avatar, gaze=True, avatar=True)

💡 Tip: This approach is perfect for quick prototyping. For production experiments, use the stand-in avatar method for more precise, repeatable placement.

When this is enough:

  • Single avatar
  • No need for precise placement via Inspector
  • No stand-in or scene editing

Supported Avatar Sources

SightLab works with avatars from many common libraries:

| Source | Description | Link |
| --- | --- | --- |
| Mixamo | Free rigged characters and animations | mixamo.com |
| Ready Player Me | Customizable avatars from photos | Workflow Guide |
| Reallusion | Character Creator & ActorCore library | reallusion.com |
| RocketBox | Microsoft's research avatar library | GitHub |
| Avaturn / MetaPerson | AI-generated avatars | Avaturn / MetaPerson |

📦 Browse more resources at the WorldViz Asset Browser


Included Avatars

Sample avatars ship with SightLab in these locations:

| Path | Contents |
| --- | --- |
| sightlab_resources/avatar/full_body | Full-body rigged avatars |
| sightlab_resources/avatar/Complete Character | Complete character models |
| sightlab_utils/resources/avatar | Utility avatars |

Example included avatars


Preparing an Avatar

Before adding an avatar to your experiment, verify it in Inspector.

What to Check in Inspector

  • Scale (meters)
  • Root transform
  • Available animations
  • Facial morphs (if present)

To open an avatar:

  • Right-click the avatar file → Open with Inspector
  • Or open Inspector and use File → Open

To verify scale:

  • Select the top root node
  • Confirm height is reasonable (≈1.6–1.8 meters)

If scaling is needed:

  1. Right-click the root node
  2. Choose Insert Above → Transform (if there is not already a transform)
  3. Adjust scale using the transform gizmo

Inspector showing avatar scale and root node

You can also preview animations in the Animations tab.
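If you prefer to sanity-check scale in code rather than by eye, the arithmetic is simple: divide the target height by the measured bounding-box height to get the uniform scale for the inserted transform. A minimal, framework-independent sketch (the helper name and the 1.75 m default are assumptions, not SightLab API):

```python
def corrective_scale(measured_height_m, target_height_m=1.75):
    """Hypothetical helper: uniform scale factor that brings an avatar of
    the given measured bounding-box height to a target human height."""
    if measured_height_m <= 0:
        raise ValueError("measured height must be positive")
    return target_height_m / measured_height_m

# An avatar exported in centimeters reads as ~175 "meters" tall:
print(corrective_scale(175.0))  # 0.01 -- scale the inserted transform down 100x
```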


Adding the Avatar to Your Project

Place avatar files inside your project directory. A recommended structure:

Project/
└── Resources/
    └── avatars/
        └── MyAvatar.osgb

Then load it in your script:

avatar = vizfx.addAvatar('Resources/avatars/MyAvatar.osgb')
sightlab.addSceneObject("avatar", avatar, gaze=True, avatar=True)

⚠️ Important: Setting avatar=True ensures the avatar appears correctly in replay. Without this flag, the avatar may not be visible during session playback.


Placing Avatars in the Scene (Best Practice)

Why Use a Stand-In Avatar?

Using a placeholder avatar in the environment lets you:

  • Position avatars visually
  • Avoid hard-coded coordinates
  • Keep placement consistent across sessions

Step 1 – Add a Stand-In Avatar in Inspector

  1. Open your environment in Inspector
  2. Go to File → Add
  3. Select standInAvatar.osgb, located in sightlab_resources/objects (alternatively, place your actual avatar directly and use its root node in the later step)
  4. Select the avatarTransform (or first transform node)
  5. Translate and rotate the stand-in to the desired position

Stand-in avatar placement in Inspector


Step 2 – Place the Real Avatar with avatarPlacer

Import the module:

from sightlab_utils import avatarPlacer

After SightLab starts:

env = sightlab.getEnvironment()
avatarPlacer.place(env, avatar, 'avatarStandin')
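Conceptually, a stand-in placer copies the stand-in's position and orientation onto the real avatar and hides the placeholder. The sketch below illustrates that logic with plain Python objects; it is not the avatarPlacer implementation, just the idea it embodies:

```python
from dataclasses import dataclass

@dataclass
class Node:
    """Toy scene node: position in meters, euler as (yaw, pitch, roll) in degrees."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    euler: tuple = (0.0, 0.0, 0.0)
    visible: bool = True

def place_on_standin(environment, avatar, standin_name):
    """Copy the named stand-in's transform onto the avatar, then hide the stand-in."""
    standin = environment[standin_name]
    avatar.position = standin.position
    avatar.euler = standin.euler
    standin.visible = False
    return avatar

# Usage: the environment maps node names to nodes
env = {'avatarStandin': Node('avatarStandin', position=(1.0, 0.0, 3.0), euler=(90.0, 0.0, 0.0))}
npc = place_on_standin(env, Node('npc'), 'avatarStandin')
print(npc.position, env['avatarStandin'].visible)  # (1.0, 0.0, 3.0) False
```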

To add more avatars:

  • Duplicate the stand-in
  • Rename it (e.g. avatarStandin2)
  • Call avatarPlacer.place() again:

avatarPlacer.place(env, secondAvatar, 'avatarStandin2')
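The naming convention above (avatarStandin, avatarStandin2, …) is easy to generate when a scene needs several NPCs. A small sketch of that convention (the helper is illustrative, not part of SightLab):

```python
def standin_names(count, base="avatarStandin"):
    """First stand-in keeps the bare name; later ones get a numeric suffix."""
    return [base if i == 0 else f"{base}{i + 1}" for i in range(count)]

print(standin_names(3))  # ['avatarStandin', 'avatarStandin2', 'avatarStandin3']
```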

Using GEODE or Avatar Root Nodes

avatarPlacer works with:

  • GEODE nodes
  • Avatar root nodes

avatarPlacer.place(env, avatar, "GEODE_NODE_NAME")
# or
avatarPlacer.place(env, avatar, "avatarRoot")

Scene graph showing GEODE selection


Animating the Avatar

To play an animation state:

avatar.state(1)  # Play animation at index 1

Animation indices depend on the avatar and can be previewed in Inspector's Animations tab.

Animation Examples

# Loop an animation
avatar.state(1)

# Play animation once
avatar.execute(1)

# Blend between animations
avatar.blend(1, 2, 0.5)  # 50% blend between state 1 and 2

# Set animation speed
avatar.speed(1.5)  # 1.5x speed

Animation Triggers

You can trigger animations based on:

  • Events – Gaze enter/exit, proximity, button press
  • Time – After a delay or at specific trial moments
  • AI Logic – Dynamic responses from intelligent agents
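One common way to wire these triggers is a lookup from event name to animation state, so engine callbacks only need to call a single dispatcher. A framework-independent sketch (the event names and indices are assumptions; preview your avatar's real indices in Inspector):

```python
# Map event names to animation state indices (illustrative values)
TRIGGERS = {
    "gaze_enter": 2,   # e.g. wave when looked at
    "gaze_exit": 1,    # return to idle
    "proximity": 3,    # e.g. turn toward the participant
}

def on_event(event_name, play_state):
    """Look up the animation for an event and play it via the supplied callback."""
    state = TRIGGERS.get(event_name)
    if state is not None:
        play_state(state)
    return state

played = []
on_event("gaze_enter", played.append)
on_event("unknown_event", played.append)  # no mapping, silently ignored
print(played)  # [2]
```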

Data Collection & Replay

Collecting Data on Avatars

Avatars can be treated like any other SightLab scene object for data collection:

| Metric | Description |
| --- | --- |
| Gaze | Where the participant is looking on the avatar |
| View counts | How many times the avatar was viewed |
| Dwell time | Total time spent looking at the avatar |
| Interaction events | Grabs, clicks, proximity triggers |
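Dwell time and view counts fall out of gaze enter/exit timestamps. A minimal accumulator sketch (not SightLab's internal logging, just the bookkeeping it performs):

```python
class DwellTimer:
    """Accumulate view count and total dwell time from gaze enter/exit events."""
    def __init__(self):
        self.total = 0.0   # accumulated dwell, in seconds
        self.views = 0     # number of gaze enters
        self._enter = None

    def gaze_enter(self, t):
        self._enter = t
        self.views += 1

    def gaze_exit(self, t):
        if self._enter is not None:
            self.total += t - self._enter
            self._enter = None

timer = DwellTimer()
timer.gaze_enter(1.0); timer.gaze_exit(3.5)     # 2.5 s view
timer.gaze_enter(10.0); timer.gaze_exit(10.8)   # 0.8 s view
print(timer.views, round(timer.total, 2))  # 2 3.3
```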

Register the avatar with SightLab:

sightlab.addSceneObject(
    "avatar",
    avatar,
    gaze=True,
    avatar=True)

No additional gaze-event code is required.

Replay Integration

Hide the Stand-In Avatar in Replay

Add this to your replay script to hide the placeholder:

def onTrialChanged():
    env = replay.getEnvironmentObject()
    env.getChild('avatarStandin').alpha(0)

viz.callback(TRIAL_LOADED_EVENT, onTrialChanged)
onTrialChanged()

This keeps the real avatar visible while removing setup artifacts.
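If a scene contains several stand-ins, the same hide step generalizes to a loop over names. A sketch of the bookkeeping, using a dictionary in place of the environment node (getChild/alpha are the real calls shown above; everything else here is illustrative):

```python
def hide_standins(env, names):
    """Zero the alpha of each named stand-in that exists; return the ones hidden."""
    hidden = []
    for name in names:
        node = env.get(name)
        if node is not None:
            node["alpha"] = 0.0
            hidden.append(name)
    return hidden

env = {"avatarStandin": {"alpha": 1.0}, "avatarStandin2": {"alpha": 1.0}}
print(hide_standins(env, ["avatarStandin", "avatarStandin2", "avatarStandin3"]))
# ['avatarStandin', 'avatarStandin2']
```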



Creating or Capturing Animations

Animation Tools

| Tool | Best for |
| --- | --- |
| Kinetix | AI-powered animation from video |
| DeepMotion | Motion capture from video |
| Rokoko | Affordable mocap suits & gloves |
| MotionBuilder | Professional animation editing |
| Mocopi | Sony's portable mocap sensors |

High-End Motion Capture

| System | Description |
| --- | --- |
| Xsens | Inertial mocap suits |
| OptiTrack | Optical tracking systems |
| Vicon | Industry-standard optical mocap |

Helpful References


Troubleshooting

Common Issues & Fixes

| Issue | Cause | Fix |
| --- | --- | --- |
| Avatar too large/small | Incorrect export scale | Add a transform above the root in Inspector and use the scale tool |
| Avatar missing in replay | Missing flag | Ensure avatar=True in addSceneObject() |
| Stand-in visible in replay | Not hidden | Set alpha(0) on the stand-in in the replay script |
| Animation not playing | Wrong index or incompatible skeleton | Verify the animation index in Inspector; check skeleton compatibility |
| Avatar in T-pose | Animation not applied | Call avatar.state(n) after loading |
| Avatar facing wrong direction | Rotation offset | Use avatar.setEuler(180, 0, 0) to rotate |
| Avatar floating/underground | Y-position offset | Adjust Y in setPosition() or fix the root transform |
| Animations jittery | Frame rate or blending issue | Check that the animation FPS matches; avoid rapid state changes |
| Avatar not receiving gaze data | Not registered | Ensure gaze=True in addSceneObject() |
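For the floating/underground row, the fix is usually arithmetic: the Y passed to setPosition() must cancel any vertical offset baked into the export's root transform. A sketch of that computation (the helper and its offset parameter are illustrative, not SightLab API):

```python
def grounded_position(x, z, root_offset_y=0.0, floor_y=0.0):
    """Return an (x, y, z) for setPosition() that puts the feet on the floor.

    root_offset_y is the Y of the feet relative to the root origin, so the
    root must be placed at floor_y - root_offset_y to cancel it out."""
    return (x, floor_y - root_offset_y, z)

# Export whose feet sit 0.1 m below the root origin (avatar sinks underground),
# so the root must be raised by 0.1 m:
print(grounded_position(0.0, 2.0, root_offset_y=-0.1))  # (0.0, 0.1, 2.0)
```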

Sample Script

A complete working example is included at:

ExampleScripts/Adding_NPCs.py

More information is available in the Vizard documentation for Avatars.
