BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Seoul
X-LIC-LOCATION:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230103T035306Z
LOCATION:Auditorium\, Level 5\, West Wing
DTSTART;TZID=Asia/Seoul:20221206T100000
DTEND;TZID=Asia/Seoul:20221206T120000
UID:siggraphasia_SIGGRAPH Asia 2022_sess153_papers_255@linklings.com
SUMMARY:Force-Aware Interface via Electromyography for Natural VR/AR Interaction
DESCRIPTION:Technical Papers\n\nForce-Aware Interface via Electromyography for Natural VR/AR Interaction\n\nZhang, Liang, Chen, Torrens, Atashzar...\n\nWhile tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience.\n\nBy identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.\n\nRegistration Category: FULL ACCESS, EXPERIENCE PLUS ACCESS, EXPERIENCE ACCESS, TRADE EXHIBITOR\n\nLanguage: ENGLISH\n\nFormat: IN-PERSON
URL:https://sa2022.siggraph.org/en/full-program/?id=papers_255&sess=sess153
END:VEVENT
END:VCALENDAR