#SIGGRAPHAsia | #SIGGRAPHAsia2022











Televerse: Teleport to the Augmented Real-World with Live Visual Effects

Description
CG visual effects (VFX) enable seamless blending between computer-generated imagery and recorded real footage. VFX post-processing augments live-action films with virtual assets. Recent advances in real-time technologies drive the transition from offline post-production to real-time production. Immersive media technologies transform the end-user experience from passive watching to a high sense of presence within the story. Furthermore, high-speed networking shifts media distribution from pre-recorded delivery to live streaming.
This course introduces live visual effects, which augment immersive videos with real-time 3D computer graphics. The new framework introduces a novel live platform that gives users the illusion of virtually teleporting to the captured real space. The user can interact with scene objects in the video, with coherent illumination and blending between the virtual objects and the real background, in real time.
We discuss a concept for immersive telepresence, the feeling of “being there” in a remote real environment. The immersive and interactive media technologies that enhance remote telepresence are introduced, including real-time panoramic/volumetric capturing, immersive media streaming, and real-time 6-DoF navigation.
Inverse rendering to automatically estimate the scene environment using machine learning is further discussed. The steps for real-time VFX are then explained: real-time image-based lighting, rendering, and composition for high-fidelity mixed-reality scenes.
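The real-time VFX steps above, image-based lighting of a virtual object followed by composition over the live background, can be sketched minimally. This is an illustrative assumption of the pipeline, not the course's actual implementation: it uses a simple diffuse-only lighting model over a sampled environment map and a standard alpha composite, with made-up function names.

```python
import numpy as np

def ibl_diffuse(albedo, normal, env_dirs, env_radiance):
    """Diffuse image-based lighting (illustrative sketch).

    Approximates the irradiance at a surface point by averaging the
    estimated environment radiance over sampled directions, weighted
    by the cosine of the angle to the surface normal.
    """
    cosines = np.clip(env_dirs @ normal, 0.0, None)              # (N,)
    irradiance = (env_radiance * cosines[:, None]).mean(axis=0) * np.pi
    return albedo * irradiance                                   # (3,) RGB

def composite(foreground, alpha, background):
    """Alpha-composite the lit virtual object over the live video frame."""
    a = alpha[..., None]                                         # (H, W, 1)
    return a * foreground + (1.0 - a) * background
```

In a live system the environment samples would come from the machine-learning environment estimation of each incoming frame, so the virtual object's shading stays coherent with the real background as the lighting changes.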
We showcase a novel system setup that integrates the core research into a working prototype and enables the user experience of remote telepresence and telecollaboration. User study methods to evaluate the user experience are discussed, with an introduction to the important metrics for measuring impact.
Finally, case studies with public end-users are introduced, including a livestreamed concert for 2,000 online viewers (controlled by a producer in real time with augmented live VFX), virtual tourism, and virtual field trips for remote education. Potential applications and future extensions are further discussed in the session.
Event Type
Courses
Time
Wednesday, 7 December 2022, 11:00am - 12:45pm KST
Location
Room 322, Level 3, West Wing



