BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Seoul
X-LIC-LOCATION:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20230103T035308Z
LOCATION:Room 325-AB\, Level 3\, West Wing
DTSTART;TZID=Asia/Seoul:20221207T110000
DTEND;TZID=Asia/Seoul:20221207T123000
UID:siggraphasia_SIGGRAPH Asia 2022_sess161_papers_222@linklings.com
SUMMARY:Efficient Light Probes for Real-time Global Illumination
DESCRIPTION:Technical Communications, Technical Papers\n\nEfficient Light Probes for Real-time Global Illumination\n\nGuo, Zong, Song, Fu, Tao...\n\nReproducing physically-based global illumination (GI) effects has been a long-standing demand for many real-time graphical applications. In pursuit of this goal, many recent engines resort to some form of light probes baked in a precomputation stage. Unfortunately, the GI effects stemming from the precomputed probes are rather limited due to the constraints in the probe storage, representation or query. In this paper, we propose a neural method for probe-based GI rendering which can generate a wide range of GI effects, including glossy reflection with multiple bounces, in complex scenes. The key contributions behind our work include a gradient-based search algorithm and a neural image reconstruction method. The search algorithm is designed to reproject the probes' contents to any query viewpoint, without introducing parallax errors, and converges fast to the optimal solution. The neural image reconstruction method, based on a dedicated neural network and several G-buffers, tries to recover high-quality images from low-quality inputs due to limited resolution or (potential) low sampling rate of the probes. This neural method makes the generation of light probes efficient. Moreover, a temporal reprojection strategy and a temporal loss are employed to improve temporal stability for animation sequences. The whole pipeline runs in real-time (>30 frames per second) even for high-resolution (1920x1080) outputs, thanks to the fast convergence rate of the gradient-based search algorithm and a light-weight design of the neural network. Extensive experiments on multiple complex scenes have been conducted to show the superiority of our method over the state of the art.\n\nRegistration Category: FULL ACCESS, ON-DEMAND ACCESS\n\nLanguage: ENGLISH\n\nFormat: IN-PERSON, ON-DEMAND
URL:https://sa2022.siggraph.org/en/full-program/?id=papers_222&sess=sess161
END:VEVENT
END:VCALENDAR