
Full Body Tracked Avatars

(Single-User, Multi-User, OpenXR, Vive Trackers, Mocopi & More)

SightLab 2.6 and later include built-in support for Full and Upper Body Tracked Avatars using an enhanced version of Vizard’s Inverse Kinematics (IK) system. This allows experiments to use:

  • Full body avatars for single-user or multi-user sessions
  • A wide range of tracking inputs: OpenXR passthrough body tracking, Vive Trackers, VR headset and controller sensors, desktop keyboard and mouse, OptiTrack/Xsens full rigs, and more (multi-user functionality may be limited with some systems)
  • Lip movement animations when speech is detected
  • Finger tracking through OpenXR, Manus gloves, Meta Quest Pro/3 controllers, or external systems (multi-user finger tracking is only available via OpenXR controller-based finger curls, while single-user supports full finger movement)
  • Optional mirror views for self-observation
  • Automatic integration with SightLab’s data logging, IK, avatar selection, and multi-user networking

1. Enabling Full Body Avatars

1.1. From the GUI (Single or Multi-User)

In both the SightLab GUI and Multi-User Client, simply:

  1. Open the project
  2. Go to Avatar Settings
  3. Enable ✔ Full Body

This unlocks the list of compatible avatars stored in:

C:\Program Files\WorldViz\Sightlab2\sightlab_resources\avatar\TrackedAvatars

Avatar Preview

  • Multi-User Client shows the preview automatically.
  • Single-user GUI → click Show to display the avatar in the scene.

2. Avatar Compatibility

2.1 Recommended Avatar Library (Current Limitations)

Tracking Method | Fully Supported Avatars | Notes
OpenXR passthrough (single-user) | Complete Characters, ReadyPlayerMe, Mixamo, Reallusion, Avaturn, Metaperson | Most flexible; full finger tracking
OpenXR passthrough (multi-user) | Complete Characters only | Finger tracking only from Quest controllers
Vive Trackers / Mocopi / OptiTrack / Xsens | Complete Characters only | Other avatar types pending full retargeting support
Replay Mode | Not yet supported (gaze position is shown) | Full body playback not yet available

Support for additional avatar libraries (Mixamo, ReadyPlayerMe, Reallusion, etc.) is planned for all tracking methods; for now, only Complete Characters avatars work outside of the single-user OpenXR method.

For more avatars than the ones included, contact support@worldviz.com or sales@worldviz.com for optional avatar packages.

Click here to see the list of available Complete Characters avatars.

For additional avatar libraries, see the SightLab Asset Browser and the "Avatar Libraries" section.


3. Tracking Systems

SightLab supports multiple tracking workflows, ranging from simple controller-based setups to full-body motion capture. This section details the behavior and limitations of each tracking method.


3.1 Standard VR Headset + Controllers (Head/Hands Only)
(Vive Focus Vision, Quest 2, Vive Pro, Vive Pro Eye, HP Omnicept, Varjo, etc.)

This is the most common setup and the default when no body tracking hardware is present.

Avatar Behavior

  • Head follows the headset
  • Hands follow the controllers
  • Upper-body IK only: torso, shoulders, and elbows move via IK
  • Legs remain in a static standing pose (no lower-body IK)

Notes

  • Works in Single-User and Multi-User
  • Not full body tracking — upper body only

3.2 Desktop Mode (No VR Hardware)

Desktop mode simulates movement without VR equipment.

Controls:
- WASD – Move the avatar
- Mouse – Look / rotate head
- Scroll wheel – Raise/lower main hand
- Left click – Trigger interactions

Useful for development or desktop-based studies.


3.3 OpenXR Passthrough (Upper Body + Finger Curls)
(Quest 3, Quest Pro — device dependent)

This is the only method that supports controller-based finger curls from Quest Pro/Quest 3 for Multi-User.

Avatar Behavior

  • Head follows HMD
  • Hands follow controllers
  • Upper-body IK
  • Legs remain in static standing pose (no lower-body IK)
  • Finger curl tracking works ONLY for Quest Pro / Quest 3 controllers

Finger Tracking Details

  • Single-User: full finger control
  • Multi-User: finger motion synced only via controller-based finger curls (not full hand joints)

Limitations

  • Lower-body motion is not recorded or replayed
  • No hip/knee/leg tracking from OpenXR at this time

🔗 OpenXR Setup Guide (Quest Pro / Quest 3 / Focus Vision / Varjo)


3.4 Vive Trackers / Vive Ultimate Trackers (Full Body)
(True full-body tracking)

Vive Trackers provide genuine full-body tracking including:

  • Hips
  • Feet
  • Chest
  • Elbows (optional)

Setup

  • Connect via SteamVR or Vive Business Streaming
  • Assign body roles in SteamVR/VBS
  • SightLab automatically applies the transforms through IK

Notes

  • Works in both single and multi-user projects
  • Currently supports only Complete Characters avatars

🔗 Vive Trackers Setup Guide
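
Tracker data can also be accessed from a script using the same vizconnect pattern as the hand-tracker example in section 4.3. Below is a minimal sketch, assuming your vizconnect configuration exposes a hip tracker under the name 'hip_tracker' (tracker names depend on your configuration) and that `sightlab` is your session object:

import vizact
import vizconnect

# 'hip_tracker' is a placeholder; use the tracker name from your vizconnect file
hipTracker = vizconnect.getTracker('hip_tracker').getNode3d()

def logHipPose():
    # Position and orientation reported by the Vive tracker, rounded for readability
    pos = [round(x, 3) for x in hipTracker.getPosition()]
    ori = [round(x, 3) for x in hipTracker.getEuler()]
    sightlab.setCustomTrialData(str(pos), 'Hip Pos')
    sightlab.setCustomTrialData(str(ori), 'Hip Ori')

# Update every frame
vizact.onupdate(0, logHipPose)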


3.5 Sony Mocopi (Full Body Orientation Tracking)

Mocopi is a 6-point IMU tracker system for portable full-body tracking.

Setup

  • Use the Mocopi SteamVR Application
  • SteamVR sends Mocopi joint data into SightLab
  • IK applies Mocopi data to the avatar

Notes

  • Full-body motion supported
  • IMU-based tracking
  • Supported in single and multi-user

3.6 OptiTrack, Xsens, Vicon, AR51, and Other Motion Capture Systems (Full Body)

These systems provide high-accuracy full-body tracking for research labs.

Setup

  • Stream data via device-specific software (MVN, etc.), MotionBuilder, or system-specific exporters
  • OptiTrack works via vizconnect, MotionBuilder, a hardware-specific protocol, or custom Vizard plugins
  • Xsens MVN is supported via MVN → MotionBuilder or custom Vizard integrations
  • See links below for setup procedures

Notes

  • Highest-accuracy full-body solution, ideal for lab-scale capture
  • Finger tracking requires gloves or additional hardware
  • Requires custom per-system setup

🔗 OptiTrack Setup Guide
🔗 Xsens Setup Guide
🔗 Additional Supported Devices


3.7 Hand/Finger Tracking Inputs

SightLab supports finger tracking from multiple sources:

  • Meta Quest Pro / Quest 3 – controller-based finger curls
  • Manus gloves – via Manus plug-ins
  • OpenXR hand tracking – depending on headset
  • Cyberglove

Data is automatically forwarded to the IK avatar.

Hand tracking examples


3.8 Face Tracking

SightLab supports face tracking for facial expression and eye movement capture via OpenXR:

  • Meta Quest Pro / Quest 3 – built-in face and eye tracking
  • Vive Focus Vision – with Face Tracker add-on accessory

🔗 Face Tracking Setup Guide


4. Data Logging Behavior

SightLab automatically records core gaze, head, and interaction data for every session. Additional body or finger data can be logged manually by the user.

4.1 Data Logged by Default

SightLab automatically saves the following metrics regardless of hardware:

Head & Hands

  • Head pose (position + rotation)
  • Hand pose (position + rotation)
  • Controller button states

Gaze & Fixations

  • Combined and per-eye gaze intersection points
  • Fixations, saccades, and dwell metrics
  • Gaze timestamps and event markers

4.2 Optional: Logging Additional Body or Finger Data

Researchers can add custom trial data columns for any avatar bone, tracker, or finger joint using:

sightlab.setCustomTrialData(value, "ColumnName")

You may optionally log:
- Additional finger joint or curl values
- Any bone position or rotation
- Any VR device tracker (hip, foot, elbow, chest, etc.)

Finger Curls (User-Added Logging)
SightLab does not save detailed finger joint values by default. Finger curl logging can be added using the example:
ExampleScripts\Hand_Tracking_Grabbing_Physics

This demonstrates:
- Reading Quest Pro / Quest 3 finger curl data
- Adding custom trial data columns
- Saving pinch, grip, and joint values
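
As a rough illustration of the pattern (not the actual device API), the sketch below logs per-finger curl values each frame. `readFingerCurls` is a hypothetical stand-in for the device-specific read demonstrated in the example script, and `sightlab` is your SightLab session object:

import vizact

def readFingerCurls(hand):
    # Hypothetical stand-in for the Quest Pro / Quest 3 controller read
    # shown in ExampleScripts\Hand_Tracking_Grabbing_Physics.
    # Returns curl values from 0.0 (open) to 1.0 (fully curled).
    return {'thumb': 0.0, 'index': 0.0, 'middle': 0.0, 'ring': 0.0, 'pinky': 0.0}

def logFingerCurls():
    curls = readFingerCurls('right')
    for finger, value in curls.items():
        # One custom column per finger, e.g. "R_index_curl"
        sightlab.setCustomTrialData(str(round(value, 3)), 'R_' + finger + '_curl')

# Update every frame
vizact.onupdate(0, logFingerCurls)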


4.3 Optional: Logging Full Body Joint Positions

Users may record specific joint positions or rotations (hips, feet, elbows, etc.) using vizconnect.getTracker().

Example: Logging right-hand tracker pose every frame

import vizact
import vizconnect

rightHandTracker = vizconnect.getTracker("r_hand_tracker").getNode3d()
leftHandTracker  = vizconnect.getTracker("l_hand_tracker").getNode3d()  # unused here; set up for a matching left-hand logger

def log_right_hand():
    # Get position and orientation
    pos = rightHandTracker.getPosition()
    rot = rightHandTracker.getEuler()

    # Round for readability
    pos = [round(x, 3) for x in pos]
    rot = [round(x, 3) for x in rot]

    # Add custom columns
    sightlab.setCustomTrialData(str(pos[0]), "RHand X")
    sightlab.setCustomTrialData(str(pos[1]), "RHand Y")
    sightlab.setCustomTrialData(str(pos[2]), "RHand Z")

    sightlab.setCustomTrialData(str(rot[0]), "RHand Yaw")
    sightlab.setCustomTrialData(str(rot[1]), "RHand Pitch")
    sightlab.setCustomTrialData(str(rot[2]), "RHand Roll")

# Update every frame
vizact.onupdate(0, log_right_hand)

This pattern can be used for:
- Hip, foot, chest, or elbow trackers
- Wrist-mounted markers
- Glove-based systems
- Any avatar bone via avatar.getBone()
- Any custom transform in the scene
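
Avatar bones can be logged the same way. Below is a minimal sketch, assuming `avatar` is the participant's Vizard avatar object and the rig contains a bone named 'Bip01 L Foot' (bone names vary by avatar; you can inspect them in Inspector):

import viz
import vizact

footBone = avatar.getBone('Bip01 L Foot')

def logLeftFoot():
    # Bone position in world coordinates, rounded for readability
    pos = [round(x, 3) for x in footBone.getPosition(viz.ABS_GLOBAL)]
    sightlab.setCustomTrialData(str(pos), 'LFoot Pos')

# Update every frame
vizact.onupdate(0, logLeftFoot)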


4.4 Replay Mode Limitations

Replay mode currently supports:
- Head pose
- Gaze and fixations
- Interaction events

Replay does not yet support:
- Full-body playback
- Finger tracking playback


4.5 Full List of Saved Metrics

For a complete list of all default metrics (fixations, saccades, dwell time, object interactions, etc.), see:

🔗 Common Metrics in SightLab
https://help.worldviz.com/sightlab/common-metrics/


5. Motion Capture for Avatar Animations

Recorded motion capture animations can be used for avatar playback in SightLab. There are many options for creating animations, from AI-based video conversion to high-end motion capture systems. Animations can be exported via MotionBuilder or other tools and applied to avatars.
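
As a minimal sketch of playback on a standard Vizard avatar, assuming the exported clip is bundled with the model as animation 1 ('my_avatar.cfg' is a placeholder file name):

import viz

viz.go()  # start Vizard rendering (not needed inside a SightLab session)

# Placeholder avatar file; animation indices depend on how the clip
# was exported and bundled with the model
avatar = viz.addAvatar('my_avatar.cfg')
avatar.state(1)       # loop animation 1
# avatar.execute(1)   # or play it once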

AI-Based Animation from Video

These tools convert standard video (captured with a phone or camera) into animations using AI algorithms.

High-End Motion Capture Systems

For research-grade accuracy:

  • Xsens – Inertial motion capture
  • OptiTrack – Optical motion capture
  • Vicon – Optical motion capture
  • AR51 – Markerless motion capture

Refer to the Supported Devices page for additional motion capture systems.

6. Showing Avatars in the Replay

Since the full body avatars use a different rig than SightLab's default avatar, they do not currently appear in the replay; without the code below you would see only the gaze ray and gaze point. Add the following to your replay script to display a default avatar head showing where the user moved:

import viz
import vizfx
from sightlab_utils import replay as Replay
from sightlab_utils.replay_settings import *

replay = Replay.SightLabReplay()

#where '1' is the first client, '2' the second, etc.
avatarHead = replay.sceneObjects[AVATAR_HEAD_OBJECT]['1']
glowbot = vizfx.addChild(r'C:\Program Files\WorldViz\Vizard8\bin\lib\site-packages\sightlab_utils\resources\avatar\head\GlowManBlue.osgb')

#link the recorded head transform to the visible head model
viz.link(avatarHead, glowbot)
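
Note that viz.link creates a persistent link that updates the head model every frame, so a single call after both nodes exist is sufficient.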

Example Scripts

(From ExampleScripts folder)

Multi-User Docs

Inspector / Avatar Editing

See this page for how to use Inspector, where you can view avatar textures, scale, and morph values.
