BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Seoul
X-LIC-LOCATION:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:19881009T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230103T035312Z
LOCATION:Room 325-AB\, Level 3\, West Wing
DTSTART;TZID=Asia/Seoul:20221209T140000
DTEND;TZID=Asia/Seoul:20221209T153000
UID:siggraphasia_SIGGRAPH Asia 2022_sess178_papers_255@linklings.com
SUMMARY:Force-Aware Interface via Electromyography for Natural VR/AR Interaction
DESCRIPTION:Technical Communications\, Technical Papers\n\nForce-Aware Interface via Electromyography for Natural VR/AR Interaction\n\nZhang\, Liang\, Chen\, Torrens\, Atashzar...\n\nWhile tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR)\, introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However\, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience.\n\nBy identifying users' muscle activation patterns while engaging in VR/AR\, we design a learning-based neural interface for natural and intuitive force inputs. Specifically\, we show that lightweight electromyography sensors\, resting non-invasively on users' forearm skin\, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model\, our interface can decode finger-wise forces in real-time with 3.3% mean error\, and generalize to new users with little calibration. Through an interactive psychophysical study\, we show that human perception of virtual objects' physical properties\, such as stiffness\, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately\, we envision our findings to push forward research towards more realistic physicality in future VR/AR.\n\nRegistration Category: FULL ACCESS\, ON-DEMAND ACCESS\n\nLanguage: ENGLISH\n\nFormat: IN-PERSON\, ON-DEMAND
URL:https://sa2022.siggraph.org/en/full-program/?id=papers_255&sess=sess178
END:VEVENT
END:VCALENDAR