Data-driven Approaches for Interactive Appearance Editing
Date: 2015-06-22
Author: Nguyen, Chuong H.
Abstract
This thesis proposes several techniques for interactive editing of digital content and fast
rendering of virtual 3D scenes. Editing digital content, such as images or 3D scenes,
is difficult and requires both artistic talent and technical expertise. To alleviate these difficulties,
we exploit data-driven approaches that use easily accessible Internet data (e.g., images,
videos, materials) to develop new tools for digital content manipulation. Our proposed
techniques allow casual users to achieve high-quality editing by interactively exploring the
manipulations without the need to understand the underlying physical models of appearance.
First, the thesis presents a fast algorithm for realistic image synthesis of virtual 3D scenes.
This serves as the core framework for a new method that allows artists to fine-tune the
appearance of a rendered 3D scene. Here, artists directly paint the final appearance and the
system automatically solves for the material parameters that best match the desired look.
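The inverse problem described above can be illustrated with a minimal sketch: assuming a toy renderer that is linear in two hypothetical material parameters (albedo and specular weight), the parameters that best match a painted target follow from a least-squares fit. The thesis' actual renderer and material model are of course far richer; this only shows the paint-then-solve idea.

```python
import numpy as np

# Toy "renderer" (hypothetical stand-in for a full physically based renderer):
# pixel color = albedo * shading + specular * highlight.
def render(params, shading, highlight):
    albedo, specular = params
    return albedo * shading + specular * highlight

# Per-pixel shading and highlight maps of a tiny 4-pixel patch.
shading = np.array([0.2, 0.5, 0.8, 1.0])
highlight = np.array([0.0, 0.1, 0.6, 0.9])

# The artist "paints" the desired final appearance of the patch.
true_params = np.array([0.7, 0.3])
painted = render(true_params, shading, highlight)

# Inverse solve: find material parameters whose rendering best matches
# the painted target, in the least-squares sense.
A = np.stack([shading, highlight], axis=1)      # 4x2 design matrix
recovered, *_ = np.linalg.lstsq(A, painted, rcond=None)

print(np.round(recovered, 6))  # -> [0.7 0.3]
```

Because the target here lies exactly in the renderer's output space, the fit is exact; with real paint strokes the solver would instead return the closest achievable appearance.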
Along this line, an example-based material assignment approach is proposed, where the
3D models of a virtual scene can be "materialized" simply by giving a guidance source
(image/video). Next, the thesis proposes shape and color subspaces of an object that are
learned from a collection of exemplar images. These subspaces can be used to constrain
image manipulations to valid shapes and colors, or provide suggestions for manipulations.
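One way to realize such a constraint, sketched below under simplifying assumptions, is to learn a linear (PCA) subspace from exemplar vectors and project a user's edit back onto it; the synthetic "shape" vectors and the single principal direction are illustrative, not the thesis' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical exemplar "shapes": contour coordinates flattened into 6-D
# vectors, generated so that valid shapes lie along one direction.
base = np.array([1.0, 0.0, 0.5, 1.0, 0.0, 0.5])
direction = np.array([0.0, 1.0, 0.2, -0.3, 1.0, 0.1])
exemplars = np.stack([base + t * direction for t in rng.uniform(-1, 1, 50)])

# Learn the subspace with PCA: mean shape plus principal directions.
mean = exemplars.mean(axis=0)
U, S, Vt = np.linalg.svd(exemplars - mean, full_matrices=False)
basis = Vt[:1]                      # keep the top principal direction

# A free-form user edit may leave the space of valid shapes...
edited = base + 0.5 * direction + np.array([0.3, 0, 0, 0, 0, 0])

# ...so constrain it by projecting onto the learned subspace.
coeffs = (edited - mean) @ basis.T
constrained = mean + coeffs @ basis
```

The same projection step can also rank nearby subspace points to offer manipulation suggestions rather than a single snapped result.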
Finally, the thesis proposes data-driven color manifolds that contain the colors of a specific context.
Such color manifolds can be used to improve color-picking performance, color stylization,
compression, or white balancing.
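A minimal sketch of the color-manifold idea, under the assumption that the context's colors concentrate near a one-dimensional strip in RGB space (the sample data below is synthetic): fit the strip with PCA, then expose a slider that moves only along the manifold, so every picked color stays plausible for that context.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical context colors (e.g. a narrow family of tones): RGB samples
# spread along a line in color space, with a little noise.
t = rng.uniform(0, 1, 200)
colors = np.stack([0.9 - 0.4 * t, 0.7 - 0.4 * t, 0.6 - 0.4 * t], axis=1)
colors += rng.normal(0, 0.01, colors.shape)

# Learn a 1-D color manifold with PCA: mean color + main direction.
mean = colors.mean(axis=0)
U, S, Vt = np.linalg.svd(colors - mean, full_matrices=False)
axis = Vt[0]
proj = (colors - mean) @ axis       # sample positions along the manifold

# A 1-D color picker: slider position u in [0, 1] maps to an RGB color
# on the manifold instead of an arbitrary point in the full RGB cube.
def pick(u):
    pos = proj.min() + u * (proj.max() - proj.min())
    return np.clip(mean + pos * axis, 0.0, 1.0)
```

Reducing the picker's search space from three dimensions to one is what makes such manifolds attractive for faster color selection, and the same low-dimensional parameterization lends itself to stylization or compact color encoding.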