BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Seoul
X-LIC-LOCATION:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230103T035307Z
LOCATION:Auditorium\, Level 5\, West Wing
DTSTART;TZID=Asia/Seoul:20221206T100000
DTEND;TZID=Asia/Seoul:20221206T120000
UID:siggraphasia_SIGGRAPH Asia 2022_sess153_papers_431@linklings.com
SUMMARY:QuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars
DESCRIPTION:Technical Papers\n\nQuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars\n\nWinkler\, Won\, Ye\n\nReal-time tracking of human body motion is crucial for interactive and\nimmersive experiences in AR/VR. However\, very limited sensor data about\nthe body is available from standalone wearable devices such as HMDs (Head\nMounted Devices) or AR glasses. In this work\, we present a reinforcement\nlearning framework that takes in sparse signals from an HMD and two\ncontrollers\, and simulates plausible and physically valid full body motions.\nUsing high quality full body motion as dense supervision during training\,\na simple policy network can learn to output appropriate torques for the\ncharacter to balance\, walk\, and jog\, while closely following the input signals.\nOur results demonstrate surprisingly similar leg motions to ground truth\nwithout any observations of the lower body\, even when the input is only\nthe 6D transformations of the HMD. We also show that a single policy\ncan be robust to diverse locomotion styles\, different body sizes\, and novel\nenvironments.\n\nRegistration Category: FULL ACCESS\, EXPERIENCE PLUS ACCESS\, EXPERIENCE ACCESS\, TRADE EXHIBITOR\n\nLanguage: ENGLISH\n\nFormat: IN-PERSON
URL:https://sa2022.siggraph.org/en/full-program/?id=papers_431&sess=sess153
END:VEVENT
END:VCALENDAR