On Demand Solid Texture Synthesis Using Deep 3D Networks
Date
2020
Author
Gutierrez, J.
Rabin, J.
Galerne, B.
Hurtut, T.
Abstract
This paper describes a novel approach for on-demand volumetric texture synthesis based on a deep learning framework that allows for the generation of high-quality three-dimensional (3D) data at interactive rates. Based on a few example images of textures, a generative network is trained to synthesize coherent portions of solid textures of arbitrary sizes that reproduce the visual characteristics of the examples along some directions. To cope with the memory limitations and computational complexity inherent to both high-resolution and 3D processing on the GPU, only 2D textures referred to as 'slices' are generated during the training stage. These synthetic textures are compared to exemplar images using a perceptual loss function based on a pre-trained deep network. The proposed network is very light (fewer than 100k parameters); it therefore only requires sustainable training (i.e. a few hours) and is capable of very fast generation (around a second for 256³ voxels) on a single GPU. Integrated with a spatially seeded pseudo-random number generator (PRNG), the proposed generator network directly returns a color value given a set of 3D coordinates. The synthesized volumes achieve visual results that are at least equivalent to state-of-the-art patch-based approaches. They are naturally seamlessly tileable and can be fully generated in parallel.
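To make the coordinate-to-color idea concrete, the following is a minimal, illustrative PyTorch sketch, not the authors' implementation: a small 3D convolutional generator fed with noise derived deterministically from global voxel coordinates, so that any block of the volume can be regenerated independently. The layer sizes, channel counts, and the sine-based coordinate hash are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn


class SolidTextureGenerator(nn.Module):
    """A few 3D convolutions mapping multi-channel noise to RGB voxels;
    with these sizes the model stays well under 100k parameters."""

    def __init__(self, noise_channels=8, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(noise_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(hidden, 3, kernel_size=3, padding=1),
            nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, noise):
        return self.net(noise)


def coordinate_noise(origin, size, channels=8):
    """Deterministic noise indexed by global voxel coordinates: the same
    coordinates always produce the same values, so any block of the
    volume can be (re)generated independently and in parallel."""
    ox, oy, oz = origin
    d, h, w = size
    zs, ys, xs = torch.meshgrid(
        torch.arange(oz, oz + d, dtype=torch.float32),
        torch.arange(oy, oy + h, dtype=torch.float32),
        torch.arange(ox, ox + w, dtype=torch.float32),
        indexing="ij",
    )
    channels_out = []
    for c in range(channels):
        # Simple coordinate hash (fractional part of a scaled sine), a
        # cheap stand-in for the spatially seeded PRNG described above.
        v = torch.sin(xs * 12.9898 + ys * 78.233 + zs * 37.719 + c * 3.1) * 43758.5453
        channels_out.append(torch.frac(v) * 2.0 - 1.0)
    return torch.stack(channels_out, dim=0).unsqueeze(0)  # (1, C, D, H, W)


if __name__ == "__main__":
    g = SolidTextureGenerator()
    a = g(coordinate_noise((0, 0, 0), (32, 32, 32)))
    b = g(coordinate_noise((0, 0, 0), (32, 32, 32)))
    assert torch.allclose(a, b)  # same coordinates -> same voxels
    print(a.shape)  # torch.Size([1, 3, 32, 32, 32])
```

Because the noise depends only on the voxel coordinates, requesting the same block twice yields identical voxels, which is what makes on-demand, parallel evaluation possible; training against 2D slices with a perceptual loss, as the paper does, is not shown here.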
BibTeX
@article {10.1111:cgf.13889,
journal = {Computer Graphics Forum},
title = {{On Demand Solid Texture Synthesis Using Deep 3D Networks}},
author = {Gutierrez, J. and Rabin, J. and Galerne, B. and Hurtut, T.},
year = {2020},
publisher = {© 2020 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd},
ISSN = {1467-8659},
DOI = {10.1111/cgf.13889}
}
Related items
Showing items related by title, author, creator and subject.
- MoMaS: Mold Manifold Simulation for Real-time Procedural Texturing
  Maggioli, Filippo; Marin, Riccardo; Melzi, Simone; Rodolà, Emanuele (The Eurographics Association and John Wiley & Sons Ltd., 2022) The slime mold algorithm has recently been under the spotlight thanks to its compelling properties studied across many disciplines like biology, computation theory, and artificial intelligence. However, existing implementations ...
- Htex: Per-Halfedge Texturing for Arbitrary Mesh Topologies
  Barbier, Wilhem; Dupuy, Jonathan (ACM Association for Computing Machinery, 2022) We introduce per-halfedge texturing (Htex), a GPU-friendly method for texturing arbitrary polygon-meshes without an explicit parameterization. Htex builds upon the insight that halfedges encode an intrinsic triangulation ...
- Creating 3D Asset Variations Through 2D Style Transfer and Generated Texture Maps
  Nikolov, Ivan (The Eurographics Association, 2023) Generating 3D object variations through style transfer models applied to their textures is an easy way for creating content for games and XR applications. Most workflows focus on either generating albedo textures only ...