dc.contributor.author | Martin, Tobias | en_US |
dc.contributor.author | Joshi, Pushkar | en_US |
dc.contributor.author | Bergou, Miklós | en_US |
dc.contributor.author | Carr, Nathan | en_US |
dc.contributor.editor | Holly Rushmeier and Oliver Deussen | en_US |
dc.date.accessioned | 2015-02-28T16:07:14Z | |
dc.date.available | 2015-02-28T16:07:14Z | |
dc.date.issued | 2013 | en_US |
dc.identifier.issn | 1467-8659 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1111/cgf.12019 | en_US |
dc.description.abstract | We present a method for accelerating the convergence of continuous non‐linear shape optimization algorithms. We start with a general method for constructing gradient vector fields on a manifold, and we analyse this method from a signal processing viewpoint. This analysis reveals that we can construct various filters using the Laplace–Beltrami operator of the shape that can effectively separate the components of the gradient at different scales. We use this idea to adaptively change the scale of features being optimized to arrive at a solution that is optimal across multiple scales. This is in contrast to traditional descent‐based methods, for which the rate of convergence often stalls early once the high frequency components have been optimized. We demonstrate how our method can be easily integrated into existing non‐linear optimization frameworks such as gradient descent, Broyden–Fletcher–Goldfarb–Shanno (BFGS) and the non‐linear conjugate gradient method. We show significant performance improvement for shape optimization in variational shape modelling and parameterization, and we also demonstrate the use of our method for efficient physical simulation. | en_US |
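The abstract describes filtering the optimization gradient with the shape's Laplace–Beltrami operator and sweeping from coarse to fine scales. The sketch below is only an illustration of that general idea, not the authors' implementation: it substitutes a uniform graph Laplacian for the cotangent Laplace–Beltrami operator, applies an implicit low-pass filter (I + λL) g̃ = g to the gradient field, and assumes a hypothetical user-supplied `energy_grad` callable; all function names and parameter values are assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def uniform_laplacian(faces, n_verts):
    # Uniform (graph) Laplacian built from triangle connectivity.
    # A stand-in for the cotangent Laplace-Beltrami operator of the shape.
    rows, cols = [], []
    for f in faces:
        for i in range(3):
            a, b = f[i], f[(i + 1) % 3]
            rows += [a, b]
            cols += [b, a]
    A = sp.coo_matrix((np.ones(len(rows)), (rows, cols)),
                      shape=(n_verts, n_verts)).tocsr()
    A.data[:] = 1.0                                   # binary adjacency
    D = sp.diags(np.asarray(A.sum(axis=1)).ravel())   # vertex degrees
    return (D - A).tocsc()

def filtered_gradient(L, grad, lam):
    # Low-pass filter the gradient: solve (I + lam * L) g_f = g per coordinate.
    # Larger lam suppresses fine-scale components and keeps coarse ones.
    n = L.shape[0]
    solve = spla.factorized(sp.identity(n, format="csc") + lam * L)
    return np.column_stack([solve(grad[:, k]) for k in range(grad.shape[1])])

def multiscale_descent(V, faces, energy_grad,
                       lams=(100.0, 10.0, 1.0, 0.0),
                       steps=50, step_size=1e-2):
    # Coarse-to-fine gradient descent: optimize large-scale features first,
    # then progressively release finer scales (lam = 0 is plain descent).
    # energy_grad(X) -> (n, 3) gradient of the shape energy (hypothetical).
    L = uniform_laplacian(faces, len(V))
    X = V.copy()
    for lam in lams:
        for _ in range(steps):
            g = energy_grad(X)
            d = filtered_gradient(L, g, lam) if lam > 0 else g
            X -= step_size * d
    return X
```

The same filtered direction could in principle be fed to BFGS or non-linear conjugate gradient instead of plain descent, as the abstract indicates; the fixed λ schedule and step size here are placeholder choices for illustration only.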
dc.publisher | The Eurographics Association and Blackwell Publishing Ltd. | en_US |
dc.subject | nonlinear optimization | en_US |
dc.subject | gradient preconditioning | en_US |
dc.subject | geometric flow | en_US |
dc.subject | G.1.5 | en_US |
dc.subject | Roots of Nonlinear Equations | en_US |
dc.subject | G.1.6 | en_US |
dc.subject | Optimization | en_US |
dc.subject | I.3.5 | en_US |
dc.subject | Computational Geometry and Object Modeling | en_US |
dc.title | Efficient Non‐linear Optimization via Multi‐scale Gradient Filtering | en_US |
dc.description.seriesinformation | Computer Graphics Forum | en_US |
dc.description.volume | 32 | |
dc.description.number | 6 | |