Interpolation of avatar motion
Abstract
Virtual Reality (VR) systems that enable task authoring, such as the "AuthXR" framework for industrial procedures, often rely on avatar representations whose motion realism significantly impacts user experience and training effectiveness. While effective for capturing expert demonstrations, the original AuthXR system presented by Ribeiro-Skreinig et al. exhibits limitations in avatar motion fidelity: abrupt teleportation for navigation, a lack of smooth transitions between recorded motion segments, and the absence of idle behaviours. This thesis addresses these shortcomings by implementing enhanced avatar animation within the AuthXR framework using Unity. We replace teleportation with continuous locomotion that integrates Unity's Navigation System with animation playback. Smooth transitions between different motion states (e.g., walking, idle, task execution) are achieved through motion interpolation techniques and blending of recorded VR motion data (HMD/controllers) played back relative to the avatar. Naturalistic idle animations are incorporated to improve perceived realism during inactive periods. The implementation leverages RootMotion's FinalIK VRIK for robust full-body inverse kinematics from sparse VR tracking data. This work contributes to improving the plausibility and continuity of avatar motion in demonstration-based VR authoring systems, enhancing the potential for effective skill transfer and user immersion.
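The core idea behind the motion-state blending described above can be illustrated with the two standard interpolation primitives involved: linear interpolation (lerp) for positions and spherical linear interpolation (slerp) for orientations. The sketch below is purely illustrative and is not the thesis implementation (which uses Unity's animation system and FinalIK VRIK); quaternions are represented as hypothetical (w, x, y, z) lists and the walk/idle poses are made-up sample values.

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two 3D position vectors."""
    return [ai + (bi - ai) * t for ai, bi in zip(a, b)]

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(c0 * c1 for c0, c1 in zip(q0, q1))
    # Take the shorter arc: flip one quaternion if the dot product is negative.
    if dot < 0.0:
        q1 = [-c for c in q1]
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: fall back to normalized lerp to avoid dividing by ~0.
        out = [c0 + (c1 - c0) * t for c0, c1 in zip(q0, q1)]
        norm = math.sqrt(sum(c * c for c in out))
        return [c / norm for c in out]
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [c0 * s0 + c1 * s1 for c0, c1 in zip(q0, q1)]

# Example: blend a walk-cycle root pose into an idle pose halfway through
# a transition window (t = 0.5). All pose values are invented for illustration.
walk_pos, idle_pos = [0.0, 1.6, 0.0], [0.1, 1.7, 0.0]
walk_rot = [1.0, 0.0, 0.0, 0.0]  # identity orientation
idle_rot = [math.cos(math.pi / 8), 0.0, math.sin(math.pi / 8), 0.0]  # 45 deg about y
blended_pos = lerp(walk_pos, idle_pos, 0.5)
blended_rot = slerp(walk_rot, idle_rot, 0.5)  # 22.5 deg about y
```

In a transition, t would be driven from 0 to 1 over the blend duration, often shaped by an ease-in/ease-out curve rather than advanced linearly.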