QuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars
Description
Real-time tracking of human body motion is crucial for interactive and
immersive experiences in AR/VR. However, very limited sensor data about
the body is available from standalone wearable devices such as HMDs (Head
Mounted Devices) or AR glasses. In this work, we present a reinforcement
learning framework that takes in sparse signals from an HMD and two
controllers, and simulates plausible and physically valid full body motions.
Using high quality full body motion as dense supervision during training,
a simple policy network can learn to output appropriate torques for the
character to balance, walk, and jog, while closely following the input signals.
Our results demonstrate surprisingly similar leg motions to ground truth
without any observations of the lower body, even when the input is only
the 6D transformations of the HMD. We also show that a single policy
can be robust to diverse locomotion styles, different body sizes, and novel
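The abstract describes a policy network that maps sparse sensor signals (the 6D transforms of an HMD and two controllers) to joint torques for a physically simulated character. The following is a minimal illustrative sketch of that input/output structure, not the authors' implementation; the observation size, joint count, and layer widths are assumptions for demonstration only.

```python
import numpy as np

# Illustrative assumptions (not from the paper): 3 tracked devices, each
# represented by a 3D position plus a 6D rotation representation; an
# assumed joint count for the simulated character; one hidden layer.
OBS_DIM = 3 * 9      # 3 devices x (3 position + 6 rotation) values
NUM_JOINTS = 33      # hypothetical articulated-body joint count
HIDDEN = 256         # hypothetical hidden-layer width

rng = np.random.default_rng(0)

class TorquePolicy:
    """Sketch of a simple MLP policy: sensor observation -> per-joint torque."""

    def __init__(self):
        self.w1 = rng.standard_normal((OBS_DIM, HIDDEN)) * 0.05
        self.b1 = np.zeros(HIDDEN)
        self.w2 = rng.standard_normal((HIDDEN, NUM_JOINTS)) * 0.05
        self.b2 = np.zeros(NUM_JOINTS)

    def __call__(self, obs):
        h = np.tanh(obs @ self.w1 + self.b1)  # bounded hidden activations
        return h @ self.w2 + self.b2          # raw torques for the simulator

policy = TorquePolicy()
obs = rng.standard_normal(OBS_DIM)  # stand-in for real HMD/controller signals
torques = policy(obs)
print(torques.shape)  # one torque value per simulated joint
```

In a full pipeline, these torques would drive a physics simulator each control step, and the policy weights would be trained with reinforcement learning against dense full-body motion supervision, as the abstract describes.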
Event Type
Technical Communications
Technical Papers
Time
Tuesday, 6 December 2022, 2:00pm - 3:30pm KST