Pattern-Based Cloth Registration and Sparse-View Animation
Description
We propose a novel multi-view camera pipeline for the reconstruction and registration of dynamic clothing.
Our proposed method relies on a specifically designed pattern that allows for precise video tracking in each camera view.
We triangulate the tracked points and register the cloth surface at a fine geometric resolution with low localization error.
Compared to state-of-the-art methods, our registration exhibits stable correspondence, tracking the same points on the deforming cloth surface along the temporal sequence.
As an application, we demonstrate how our registration pipeline greatly improves state-of-the-art pose-based drivable cloth models.
Furthermore, we propose a novel model, Garment Avatar, for driving cloth from a dense tracking signal obtained from two opposing camera views. The method produces realistic reconstructions that are faithful to the actual geometry of the deforming cloth.
In this setting, the user wears a garment with our custom pattern, which enables our driving model to reconstruct the geometry. We will release our pattern and registered mesh sequences covering 4 different subjects and 15k frames in total.
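The description mentions triangulating the pattern points tracked in each camera view. As an informal illustration of that step (not the authors' released code), the sketch below triangulates a single 3D point from calibrated views with the standard direct linear transform; the camera matrices `P0`, `P1` and the helper `triangulate_point` are hypothetical names introduced only for this example.

```python
# Minimal multi-view triangulation sketch (DLT): recover a 3D point from its
# tracked 2D projections in several calibrated views. Camera matrices and the
# tracked pattern observations are assumed inputs; this is not the paper's code.
import numpy as np

def triangulate_point(projections, points_2d):
    """Triangulate one 3D point.

    projections: list of 3x4 camera projection matrices P_i.
    points_2d:   list of (x, y) pixel observations of the same pattern point.
    Returns the 3D point in world coordinates.
    """
    rows = []
    for P, (x, y) in zip(projections, points_2d):
        # Each view contributes two linear constraints on the homogeneous point X:
        #   x * (P[2] @ X) - (P[0] @ X) = 0
        #   y * (P[2] @ X) - (P[1] @ X) = 0
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy example with two synthetic views (identity camera and a translated camera).
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])
P1 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 4.0, 1.0])
obs = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P0, P1)]
print(triangulate_point([P0, P1], obs))  # approx. [0.2, -0.1, 4.0]
```

With noisy tracks from many views, the same least-squares formulation applies; the paper's pipeline additionally registers the triangulated points to a template cloth surface, which is not shown here.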
Event Type
Technical Papers
Time
Wednesday, 7 December 2022, 11:00am - 12:30pm KST