SIGGRAPH Asia 2022
Neural Parameterization for Dynamic Human Head Editing
Description: Implicit radiance functions have emerged as a powerful scene representation for reconstructing and rendering photo-realistic views of a 3D scene. These representations, however, suffer from poor editability. Explicit representations such as polygonal meshes, on the other hand, allow easy editing but are less suitable for reconstructing the fine details of dynamic human heads, such as facial features, hair, and eyes. In this work, we present Neural Parameterization (NeP), a hybrid representation that combines the advantages of both implicit and explicit methods. NeP is capable of photo-realistic rendering while allowing fine-grained editing of the scene geometry and appearance. We first disentangle geometry and appearance by parameterizing the 3D geometry into a 2D texture space. We enable geometric editability by introducing an explicit linear deformation blending layer. The deformation is controlled by a set of sparse key points, which can be explicitly and intuitively displaced to edit the geometry. For appearance, we develop a hybrid 2D texture consisting of an explicit texture map for easy editing and implicit view- and time-dependent residuals that model temporal and view variations. We compare our method against several reconstruction and editing baselines. The results show that NeP achieves nearly the same rendering accuracy while maintaining high editability.
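The abstract's linear deformation blending layer, driven by sparse key points, can be illustrated with a minimal sketch: each surface point moves by a weighted blend of key-point displacements. The Gaussian blend weights, function names, and shapes below are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of keypoint-driven linear deformation blending (illustrative only;
# the Gaussian RBF weighting is an assumption, not NeP's learned layer).
import numpy as np

def blend_weights(points, keypoints, sigma=0.1):
    """Normalized Gaussian weights of each surface point w.r.t. each key point."""
    d2 = ((points[:, None, :] - keypoints[None, :, :]) ** 2).sum(-1)  # (P, K)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)  # each row sums to 1

def deform(points, keypoints, displacements, sigma=0.1):
    """Move surface points by a linear blend of key-point displacements."""
    w = blend_weights(points, keypoints, sigma)  # (P, K)
    return points + w @ displacements            # (P, 3)

# Editing example: displacing one key point smoothly drags nearby geometry.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
kps = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
disp = np.array([[0.0, 0.2, 0.0], [0.0, 0.0, 0.0]])  # lift the first key point
out = deform(pts, kps, disp)
```

Because the weights are local (small `sigma`), an edit to one key point affects only nearby surface points, which is what makes the displacement-based editing intuitive.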
Time: Thursday, 8 December 2022, 2:00pm - 3:30pm KST
Location: Room 325-AB, Level 3, West Wing