dc.contributor.author | Gao, Qiqi | en_US |
dc.contributor.author | Taketomi, Takafumi | en_US |
dc.contributor.editor | Hauser, Helwig and Alliez, Pierre | en_US |
dc.date.accessioned | 2023-10-06T11:58:50Z | |
dc.date.available | 2023-10-06T11:58:50Z | |
dc.date.issued | 2023 | |
dc.identifier.issn | 1467-8659 | |
dc.identifier.uri | https://doi.org/10.1111/cgf.14804 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.1111/cgf14804 | |
dc.description.abstract | Modelling garments with rich details requires enormous time and expertise from artists. Recent works reconstruct garments through segmentation of clothed human scans. However, existing methods rely on specific human body templates and do not perform well on loose garments such as skirts. This paper presents a two‐stage pipeline for extracting high‐fidelity garments from static scan data of clothed mannequins. Our key contribution is a novel method for tracking both tight and loose boundaries between garments and mannequin skin. Our algorithm enables the modelling of off‐the‐shelf clothing with fine details. It is independent of human template models and requires only minimal mannequin priors. The effectiveness of our method is validated through quantitative and qualitative comparison with the baseline method. The results demonstrate that our method can accurately extract both tight and loose garments within a reasonable time. | en_US |
dc.publisher | © 2023 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd. | en_US |
dc.subject | cloth modelling | |
dc.subject | digital geometry processing | |
dc.subject | mesh segmentation | |
dc.subject | modelling | |
dc.title | Garment Model Extraction from Clothed Mannequin Scan | en_US |
dc.description.seriesinformation | Computer Graphics Forum | |
dc.description.sectionheaders | ORIGINAL ARTICLES | |
dc.description.volume | 42 | |
dc.description.number | 6 | |
dc.identifier.doi | 10.1111/cgf.14804 | |