BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Seoul
X-LIC-LOCATION:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:19881009T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230103T035307Z
LOCATION:Room 324\, Level 3\, West Wing
DTSTART;TZID=Asia/Seoul:20221206T140000
DTEND;TZID=Asia/Seoul:20221206T153000
UID:siggraphasia_SIGGRAPH Asia 2022_sess154_papers_431@linklings.com
SUMMARY:QuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars
DESCRIPTION:Technical Communications, Technical Papers\n\nQuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars\n\nWinkler, Won, Ye\n\nReal-time tracking of human body motion is crucial for interactive and immersive experiences in AR/VR. However, very limited sensor data about the body is available from standalone wearable devices such as HMDs (Head Mounted Devices) or AR glasses. In this work, we present a reinforcement learning framework that takes in sparse signals from an HMD and two controllers, and simulates plausible and physically valid full body motions. Using high quality full body motion as dense supervision during training, a simple policy network can learn to output appropriate torques for the character to balance, walk, and jog, while closely following the input signals. Our results demonstrate surprisingly similar leg motions to ground truth without any observations of the lower body, even when the input is only the 6D transformations of the HMD. We also show that a single policy can be robust to diverse locomotion styles, different body sizes, and novel environments.\n\nRegistration Category: FULL ACCESS, ON-DEMAND ACCESS\n\nLanguage: ENGLISH\n\nFormat: IN-PERSON, ON-DEMAND
URL:https://sa2022.siggraph.org/en/full-program/?id=papers_431&sess=sess154
END:VEVENT
END:VCALENDAR