Neural Photometry-Guided Visual Attribute Transfer

authors

  • RODRIGUEZ PARDO DEL CASTILLO, JOSE MIGUEL
  • GARCES, ELENA

publication date

  • December 2021

start page

  • 1818

end page

  • 1830

issue

  • 3

volume

  • 29

International Standard Serial Number (ISSN)

  • 1077-2626

abstract

  • We present a deep learning-based method for propagating spatially-varying visual material attributes (e.g., texture maps or image stylizations) to larger samples of the same or similar materials. For training, we leverage images of the material taken under multiple illuminations and a dedicated data augmentation policy, making the transfer robust to novel illumination conditions and affine deformations. Our model relies on a supervised image-to-image translation framework and is agnostic to the transferred domain; we showcase a semantic segmentation, a normal map, and a stylization. Following an image analogies approach, the method only requires the training data to contain the same visual structures as the input guidance. Our approach works at interactive rates, making it suitable for material editing applications. We thoroughly evaluate our learning methodology in a controlled setup, providing quantitative measures of performance. Finally, we demonstrate that training the model on a single material is enough to generalize to materials of the same type, without the need for massive datasets.
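  • A central requirement of the supervised image-to-image setup described above is that geometric data augmentation must be applied identically to the input photograph and its aligned attribute map, so that input/target pairs stay registered. The sketch below illustrates that idea only; `paired_affine_augment` and its rotation-plus-scale warp are hypothetical stand-ins, not the paper's actual augmentation policy.

    ```python
    import numpy as np

    def paired_affine_augment(image, attribute_map, angle_deg, scale):
        """Apply the SAME affine warp (rotation + uniform scale about the
        image centre, nearest-neighbour resampling) to an image and its
        pixel-aligned attribute map, keeping the pair registered.

        Illustrative sketch only; the published method's augmentation
        policy may differ.
        """
        h, w = image.shape[:2]
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        t = np.deg2rad(angle_deg)
        # Inverse mapping: for each output pixel, find its source pixel
        # under the inverse of "rotate by t, scale by s".
        inv = np.array([[np.cos(t),  np.sin(t)],
                        [-np.sin(t), np.cos(t)]]) / scale
        ys, xs = np.mgrid[0:h, 0:w]
        coords = np.stack([ys - cy, xs - cx], axis=-1) @ inv.T
        src_y = np.clip(np.rint(coords[..., 0] + cy), 0, h - 1).astype(int)
        src_x = np.clip(np.rint(coords[..., 1] + cx), 0, w - 1).astype(int)
        # Identical index arrays warp both tensors, so labels stay aligned.
        return image[src_y, src_x], attribute_map[src_y, src_x]
    ```

    Because one pair of index arrays resamples both tensors, any attribute defined per-pixel (a normal map, a segmentation, a stylization) moves rigidly with the photograph it annotates.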

subjects

  • Telecommunications

keywords

  • visualization; lighting; training; semantics; image segmentation; image color analysis; geometry